Mostly, I was thinking about how, as far as I'm concerned, there's no real point in waiting for a civilization to reach a certain point in its development before making contact. In fact, it would actually be detrimental to wait. This flies in the face of most of our science fiction, but my justifications are rather simple.
First, we develop concepts. "We have this concept, X. It works this way."
Someone else more advanced might tell us, "Actually, X works this way," using an entirely different chain of thought, and justify the stance with: if we can't understand that much, then clearly we're idiots, right?
...Except that's wrong. A better way to think about it would be this: "We have this concept, X, and we've thought of it working this way. You have this idea, Y, that it works another way instead. But tying X with Y, maybe they're related?" Basically, an explanation that seems really complex to them might be boiled down into simple terms for us. Or vice-versa: things that are complex for us might be understood simply by them.
In short, a cultural exchange of knowledge from different perspectives, from different angles, would actually enhance both sides, since ignoring one in favor of the other is stupid. Which brings me to the next part, the second thing: we develop beliefs. "Oh, clearly, they're too primitive to handle this knowledge; they'll just use it to destroy."
Not really, no.
I mean...if the technology is easy to accidentally screw up, then sure. (We're talking like, "nuclear bomb which you can just press a button to activate and instantly blow stuff up" levels of easy-to-accidentally-screw-up.) Some caution would be needed then.
...But if the technology is like that, who's to say we wouldn't be able to idiot-proof it? There's always the chance that by introducing technology meant to be used one way, it'll end up being used in a different way. But how is this an inherently negative thing? Because of some concept of "higher morality"?
Our way wouldn't be their way. But it's pure arrogance to think one way is inherently inferior to another. Our own history is rife with examples of this. So the counter-response of "Yes, they are, it's going to happen," while holding some level of merit (see also: our history, where less advanced societies were exploited and destroyed), overall doesn't hold true for me:
It's a risk, yes, but one that carries a reward. There's always the chance things get worse, that the potential for independent growth gets snuffed out by cultural contamination (their own technologies never getting developed), but there's also the VAST potential for things to improve: for long-held ideas on both sides to be questioned, for new takes on old things. That seems like something worth the risk.
In short, this would be me throwing out something like Star Trek's Prime Directive and encouraging the opposite: interact all you like. Don't unduly influence them (you want them to develop independently, not become a carbon copy), but contact can happen at any time.