Big data is being collected almost everywhere: weather data, especially for airline transportation, can be collected every microsecond or even faster, as reliable and cheap hardware technology now makes this possible. You may have heard of algorithmic trading, where computers trade against each other, or assist humans in doing so, generating data every microsecond or faster still. These large-scale data sets contain clues as to why otherwise stable and well-monitored systems (and clever traders and meteorologists) turn volatile and take risks where none are required.
Such big data is very noisy and has to be curated – spurious data and monitoring failures are two principal reasons – that is, cleaned and then reformatted for use by systems that can detect the extent of volatility. This information about the onset of volatility can then be fed to airline pilots and market traders alike; a sketch of such a pipeline is given below.
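To make this concrete, here is a minimal sketch in Python of the curate-then-estimate step. It is illustrative only: the brief does not prescribe a method, so this sketch assumes ticks arrive as a pandas Series of prices indexed by timestamp, approximates the removal of spurious data with a simple rolling-median outlier rule, and uses the rolling standard deviation of log returns as one crude proxy for the extent of volatility.

import numpy as np
import pandas as pd

def curate(prices: pd.Series, window: int = 50, n_mads: float = 5.0) -> pd.Series:
    """Drop ticks far from a rolling median (spurious prints), then
    forward-fill the short gaps left by monitoring failures.
    This outlier rule is an assumption for illustration only."""
    median = prices.rolling(window, min_periods=1).median()
    mad = (prices - median).abs().rolling(window, min_periods=1).median()
    mask = (prices - median).abs() <= n_mads * (mad + 1e-12)
    return prices.where(mask).ffill()

def rolling_volatility(prices: pd.Series, window: int = 100) -> pd.Series:
    """Rolling standard deviation of log returns: one simple proxy for
    the 'extent of volatility' a downstream alert system could use."""
    log_returns = np.log(prices).diff()
    return log_returns.rolling(window).std()

# Toy example on synthetic millisecond ticks with two injected bad prints.
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=1_000, freq="ms")
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 1e-4, 1_000))), index=idx)
prices.iloc[[200, 600]] = [0.01, 10_000]  # spurious prints / monitoring failures
vol = rolling_volatility(curate(prices))
print(vol.dropna().tail())

A real system would replace the toy outlier rule with source-specific filters and a proper realized-volatility estimator, but the curate-then-estimate shape stays the same.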
You will have to guard against latency issues: the time taken to curate, estimate, and generate information has to be much less than the time before the next batch of data arrives.
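As a rough illustration of staying within that budget, the sketch below (again an assumption, not a prescribed design) times each pass of a hypothetical process_batch stub against a fixed inter-arrival interval and flags when the pipeline falls behind.

import time

TICK_INTERVAL_S = 0.001  # hypothetical inter-arrival time of one data batch

def process_batch(batch):
    """Stub standing in for the whole curate -> estimate -> publish chain."""
    ...

def run(batches):
    for batch in batches:
        start = time.perf_counter()
        process_batch(batch)
        elapsed = time.perf_counter() - start
        if elapsed >= TICK_INTERVAL_S:
            # The next batch arrived before this one finished, so work
            # queues up and the volatility alerts grow stale.
            print(f"latency budget exceeded: {elapsed * 1e3:.3f} ms for this batch")

run([list(range(1_000)) for _ in range(5)])  # toy batches

In practice you would measure the whole chain end to end, since curation and estimation times both count against the same budget.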
Last year, students developed such systems, and we encourage you to continue, refine, and innovate on that work.
There is a paper I wrote recently that may be of some help (and I hope to finish another on my more recent work in this area):
Wang, J., Bagla, T., Srivastava, S., Teehan, A., & Ahmad, K. (2022). Market Movements at High Frequencies and Latency in Response Times. In Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1 (pp. 943-961). Springer International Publishing. (https://link.springer.com/chapter/10.1007/978-3-030-89906-6_60)