i’m starting to realize that, if i’m being honest with myself, i was never truly taught anything about christianity as a child. i got the sunday school treatment: “jesus loves you! instead of teaching you about the gospels (y’know, the main part of our religion), we’re gonna do a word search about the animals on noah’s ark, because we want you to have fun!” i mean, yeah, i learned things, but even the “adult” services were watered down and, to be honest, kind of unimpressive. i only believed because i was supposed to, and it took stepping away for a year or two to actually understand anything.