The curious case of the GeneratedMethodAccessor7
… or how I learned not to run the spring cloud config client and tinyproxy on their respective default ports at the same time.
tl;dr§
When things suddenly start taking a long time, it's because of network timeouts. Always. Skip the "almost".
slightly longer version§
Problem§
My spring boot apps started to show symptoms of serious sluggishness. More precisely, the JVM seemed to do absolutely nothing for a long time before even attempting to start the app. Looking something like this:
java -jar build/libs/app.jar
# [Just nothing...]
# [Still nothing...]
# [Nope, nothing yet...]
# ...
# [Finally some logs, when the application actually starts]
It took me an embarrassingly long time to figure out what was going on. I thought it must have had something to do with my recent switch to java 17. Or maybe my SSD just ran out of good blocks, severely slowing down IO? No, of course not!
Surely my OS was running low on entropy! (Didn't know this is a thing? It is.)
I ran the JVM with the -verbose option to see what it was up to.
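Something along these lines (a sketch; -verbose:class is the flag's explicit class-loading form, and -Xlog:class+load=info is the JDK 9+ unified-logging spelling that produces the [info][class,load] lines shown below):
# per-class load logging
java -verbose:class -jar build/libs/app.jar
# the same thing via unified logging on JDK 9+
java -Xlog:class+load=info -jar build/libs/app.jar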
I could see it was taking more than two minutes to load the class jdk.internal.reflect.GeneratedMethodAccessor7 from source __ClassDefiner__. And then another 40 seconds to load java.util.IdentityHashMap$KeyIterator from source shared objects file:
[...]
[8,994s][info][class,load] jdk.internal.reflect.GeneratedMethodAccessor7 source: __ClassDefiner__
[141,991s][info][class,load] java.util.IdentityHashMap$KeyIterator source: shared objects file
[186,389s][info][class,load] java.io.InterruptedIOException source: jrt:/java.base
[...]
I do feel the urge to make something up about how I fired up advanced tracing tools and whatnot. But the reality is, after some days of just doing other stuff ('cause ain't nobody got time to debug the JVM), I found the culprit by chance.
Revelation§
While tailing my tinyproxy logs (which I was running for a totally unrelated project), I found unexpected requests for localhost:8888. This host/port combination shouldn't have shown up in the logs at all – the proxy was configured exclusively in one (non-JVM/Spring) app I was working on, and that app was really not supposed to make such requests.
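(Nothing fancy, by the way, just something like the following, assuming the distro's default log location, which may well differ on your system:)
tail -f /var/log/tinyproxy/tinyproxy.log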
Even with that hint, it took me a while to realize two things:
1. Spring cloud config client§
We use spring cloud config to configure parts of our apps in production. But by default, our apps start "ready for development" and omit cloud configuration. We do this by setting the spring.config.import property to optional:configserver: (as per the documentation). This will (normally) lead the config client to make a request to the default target (localhost:8888), receive a "connection refused", and be done with it.
2. Tinyproxy§
Guess what? Tinyproxy defaults to listening on port 8888.
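From a stock tinyproxy.conf (a sketch of just the relevant directive, everything else omitted):
# the port tinyproxy listens on; 8888 out of the box
Port 8888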
Conclusion§
Do you know what happens if you make a request to localhost:8888 and there's a proxy running on that port? I can't tell you exactly, but I definitely know it will lead to all kinds of shenanigans, timeouts and probably the death of your proxy while it tries to pass the request to itself in an endless loop.
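If you feel like reproducing the shenanigans (at your own risk, and not with a proxy you care about), a hypothetical minimal version of the self-request looks like this:
# ask the proxy on localhost:8888 to fetch... localhost:8888
curl -x http://localhost:8888 http://localhost:8888/
# tinyproxy now opens a connection to localhost:8888, i.e. to itself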