I'm considering getting into NLP research (specifically AI research on Transformers), but my friend made a point that got me concerned: he said NLP research is dead because of ChatGPT. We all know ChatGPT is enormous and highly versatile. Is it realistic for NLP researchers to make meaningful contributions without the resources of Google, Amazon, Ivy League schools, Facebook, OpenAI/Microsoft, etc.?
For example, is it meaningful to compete within certain "resource classes", e.g. trying to get the best BERT-style model given a fixed budget of compute or parameters?
P.S. For example, I recently saw that the Vega2 paper (2nd place on the SuperGLUE leaderboard) used 320 A100s for more than 30 days. Looking at rental prices online, that should cost about $300,000!
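For anyone wondering where that figure comes from, here's a rough back-of-the-envelope sketch. The ~$1.30/GPU-hour A100 rate is my own assumption based on cheaper rental listings; actual prices vary a lot by provider.

```python
# Back-of-the-envelope cost estimate for a Vega2-scale training run.
# NOTE: the $1.30/GPU-hour A100 rate is an assumption; real prices vary by provider.

num_gpus = 320              # A100s reported in the paper
days = 30                   # lower bound ("> 30 days")
rate_per_gpu_hour = 1.30    # assumed on-demand A100 price in USD

gpu_hours = num_gpus * days * 24
cost = gpu_hours * rate_per_gpu_hour

print(f"{gpu_hours:,} GPU-hours -> ~${cost:,.0f}")
# 230,400 GPU-hours -> ~$299,520 (about $300k; closer to $460k at $2/GPU-hour)
```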