Make the model max request size configurable
Our models ingest real-time data, and the format in which that data is sent to them is largely outside our control. Sending a pointer to a data store instead of the payload itself would most likely increase the time to service a score request, which we can't afford from a latency standpoint. Our preferred solution is to make the nginx request-size limit configurable so it can be raised above the default.
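Assuming the limit in question is nginx's `client_max_body_size` directive (which defaults to 1 MB and causes nginx to return 413 for larger request bodies), a minimal sketch of raising it might look like the following. The server name, upstream, path, and 20m value are illustrative assumptions, not values from this ticket:

```nginx
http {
    server {
        listen 80;
        server_name scoring.example.com;  # hypothetical host

        location /score {
            # Allow request bodies up to 20 MB (default is 1m).
            # Choose a value that covers the largest expected payload.
            client_max_body_size 20m;

            proxy_pass http://model_backend;  # hypothetical upstream
        }
    }
}
```

Note that `client_max_body_size` can be set in the `http`, `server`, or `location` context, with the most specific context taking precedence, so the higher limit can be scoped to just the scoring endpoint rather than the whole server.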