Replying here to the two sibling comments that were confused about the decoder example.
What tialaramex is saying is that if you have a stream of JSON values, you create a JSON decoder over it. Then every time you call the Decode() method, you get the next decoded JSON value.
Then you want to process the JSON values concurrently.
Rephrased, the question was: what would happen if every concurrent task called Decode() itself whenever it wanted a new value to work on?
It would probably be a data race clusterfuck. But you can find this type of mistake all over Go code; I've fought things like that in many libraries myself.
One such occurrence I recall was in the Google Cloud Pub/Sub client library. It did something similar to this example, trying to offer concurrency over a stream of messages. It would fail very rarely, and it pretty much always passed the race detector. It wasn't fun to debug.