Quite a lot, as it turns out. AWG 10 wire has a resistance of about 0.00328 ohms per meter. Current squared times resistance equals watts of heat (P = I²R), so 90 amps flowing through 10 gauge wire will generate about 26 watts per meter, roughly the heat output of a refrigerator light. That same 90 amps would trip a 30 amp breaker in 30 seconds or so absent a fault to ground, and I assume the damage we see in the picture isn't the result of 30 seconds' exposure to a refrigerator light.
Run the same amount of current through a 40 gauge wire, with a resistance of about 3.44 ohms per meter, and you generate about 28,000 watts of heat per meter. That's why, if I were to explain what happened here, I would ask the manufacturer what the purpose of the "trip thermostat" they refer to is, since it doesn't appear to do what they say it will do, i.e. provide "Overload Protection - internal thermostat cuts power at the first sign of overheating." Having seen it before, I would look to whatever circuitry is involved in the "trip thermostat," since like many electronic components, it ain't using 10 gauge wire.
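For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python using the per-meter resistances quoted above (the 1 A and 30 A rows are just there for comparison):

```python
# P = I^2 * R: heat dissipated per meter of wire.
# Per-meter resistances are for solid copper at room temperature.

def heat_watts_per_meter(current_amps: float, ohms_per_meter: float) -> float:
    """Watts of heat per meter of wire at a given current."""
    return current_amps ** 2 * ohms_per_meter

AWG_OHMS_PER_METER = {
    10: 0.00328,  # 10 gauge, as quoted above
    40: 3.44,     # 40 gauge, as quoted above
}

for gauge, r in AWG_OHMS_PER_METER.items():
    for amps in (1, 30, 90):
        watts = heat_watts_per_meter(amps, r)
        print(f"AWG {gauge} at {amps:>2} A: {watts:,.1f} W/m")
```

At 90 A that works out to about 26.6 W/m for the 10 gauge wire and roughly 27,900 W/m for the 40 gauge, matching the figures above. Note the AWG 40 row at 1 A: even that dissipates a few watts per meter in wire that fine.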
Not trying to pick on the manufacturer - just questioning the wisdom of placing circuitry that will melt down when exposed to an amp or so of current behind nothing but a 30 amp breaker.