Some old sage: when you're relatively new to computers, you don't understand the difference between an interpreter and a compiler. After a while, you see the difference, and it's substantial and important. After a while longer, you don't see the difference anymore, though for different reasons than at first.
A not-so-nice thing about GDB is that if you ask it to disassemble a program with slightly unusual/wrong ELF headers, it (GDB, not the program) can segfault, which is quite an unreasonable thing for a debugger to do, IMHO.
To make things a bit clearer (hope that's what you meant!):
One concrete angle is that interpreters and compilers are usually intertwined: interpreters commonly target an abstract machine, i.e. a language that's a bit simpler to execute than the full one, so they start with a compilation step. And compilers do things like constant propagation, inlining, and other flavours of partial evaluation, so they contain some sort of interpreter. See the sketch below.
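To make that concrete, here's a toy sketch I made up for this comment (not any real engine's design): an "interpreter" for arithmetic expressions that first compiles to a tiny stack bytecode, where the compile step already folds constant subexpressions, i.e. it interprets them at compile time.

    # Expressions: ("num", 3), ("var", "x"), ("add", e1, e2), ("mul", e1, e2)

    def compile_expr(e, code):
        tag = e[0]
        if tag == "num":
            code.append(("push", e[1]))
        elif tag == "var":
            code.append(("load", e[1]))
        else:  # "add" / "mul"
            a, b = e[1], e[2]
            # Constant folding: the compiler *interprets* constant subtrees
            # right now, so nothing is left to do for them at run time.
            if a[0] == "num" and b[0] == "num":
                val = a[1] + b[1] if tag == "add" else a[1] * b[1]
                code.append(("push", val))
            else:
                compile_expr(a, code)
                compile_expr(b, code)
                code.append((tag,))
        return code

    def run(code, env):
        # The abstract machine: a plain stack interpreter for the bytecode.
        stack = []
        for op in code:
            if op[0] == "push":
                stack.append(op[1])
            elif op[0] == "load":
                stack.append(env[op[1]])
            elif op[0] == "add":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op[0] == "mul":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack[0]

    expr = ("add", ("mul", ("num", 2), ("num", 3)), ("var", "x"))
    code = compile_expr(expr, [])
    print(code)                 # [('push', 6), ('load', 'x'), ('add',)]
    print(run(code, {"x": 4}))  # 10

So the "interpreter" is really compiler-then-bytecode-interpreter, and the compiler already does a bit of interpreting while it compiles.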
Another, more abstract angle is to view interpreters as compilers into a "trivial" target language consisting only of values and side effects, or, flipping it around, to view compilers as interpreters with a non-standard semantics.
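Same toy expressions, second angle: one generic evaluator whose meaning comes from a pluggable semantics. Plug in the standard semantics and it's an interpreter producing values; plug in a code-building semantics and it's a (very naive) compiler emitting target-code strings. All the names here are invented for illustration.

    def evaluate(e, env, sem):
        tag = e[0]
        if tag == "num":
            return sem.num(e[1])
        if tag == "var":
            return env[e[1]]
        if tag == "add":
            return sem.add(evaluate(e[1], env, sem), evaluate(e[2], env, sem))

    class Standard:
        # Interpreter: the "target language" is just values.
        def num(self, n): return n
        def add(self, a, b): return a + b

    class CodeGen:
        # Compiler: the same evaluator, but over fragments of target code.
        def num(self, n): return str(n)
        def add(self, a, b): return "(%s + %s)" % (a, b)

    expr = ("add", ("num", 2), ("var", "x"))
    print(evaluate(expr, {"x": 40}, Standard()))  # 42
    print(evaluate(expr, {"x": "x"}, CodeGen()))  # (2 + x)
                                                  # (env now maps names to code fragments)

Same traversal either way; only the semantics you hand it decides whether you "ran" the program or "translated" it.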