Thinking out loud, here’s one idea for an LLM-assisted interview:
- Spin up a Digital Ocean droplet
- Add the candidate’s SSH key
- Have them implement a basic API. It must be publicly accessible (see the first sketch after the list).
- Connect the API to a database. Add more features.
- Set up a basic deployment pipeline. It could be as simple as a script that copies the code from your local machine to the server (see the second sketch after the list).
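
To make the API and database steps concrete, here's a rough sketch of the kind of thing a candidate might land on. Flask and SQLite are just assumptions for illustration, and the endpoint names and file paths are made up; the candidate would be free to pick any stack.

```python
# Minimal sketch of the "basic API + database" steps, assuming Flask and SQLite
# purely for illustration. The candidate would pick their own stack.
import sqlite3

from flask import Flask, jsonify, request

app = Flask(__name__)
DB_PATH = "notes.db"  # hypothetical database file


def get_db():
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    return conn


def init_db():
    with get_db() as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT NOT NULL)"
        )


@app.post("/notes")
def create_note():
    body = (request.get_json(silent=True) or {}).get("body", "")
    if not body:
        return jsonify(error="body is required"), 400
    with get_db() as conn:
        cur = conn.execute("INSERT INTO notes (body) VALUES (?)", (body,))
    return jsonify(id=cur.lastrowid, body=body), 201


@app.get("/notes")
def list_notes():
    with get_db() as conn:
        rows = conn.execute("SELECT id, body FROM notes").fetchall()
    return jsonify([dict(row) for row in rows])


if __name__ == "__main__":
    init_db()
    # Bind to 0.0.0.0 so the API is reachable from outside the droplet,
    # behind whatever firewall rules the candidate sets up.
    app.run(host="0.0.0.0", port=8000)
```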
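
And for the deployment step, the "script that copies the code to the server" could literally be this small. Everything here is hypothetical: the host, the remote directory, the rsync-over-ssh approach, and the systemd service name.

```python
# Minimal sketch of the "deployment pipeline" step: copy the working tree to
# the droplet and restart the service. Host, paths, and service name are all
# hypothetical; rsync and a systemd unit on the server are assumptions.
import subprocess

HOST = "deploy@203.0.113.10"       # hypothetical droplet address
REMOTE_DIR = "/srv/interview-api"  # hypothetical target directory


def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


def deploy():
    # Copy local code to the server, skipping bulky or local-only directories.
    run([
        "rsync", "-avz", "--delete",
        "--exclude", ".git", "--exclude", ".venv",
        "./", f"{HOST}:{REMOTE_DIR}/",
    ])
    # Restart the app; assumes a systemd unit was set up on the droplet.
    run(["ssh", HOST, "sudo systemctl restart interview-api"])


if __name__ == "__main__":
    deploy()
```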
Anything would be fair game. The goal would be to see how the candidate converses with the LLM, how they handle unexpected changes, and how they make decisions.
Just a thought.