The term ‘Big Data’ is increasingly used across multiple industries, yet there has been little real explanation of what it actually means, as the term carries different interpretations in different parts of the industry.
At IRG, we are just starting to learn how Big Data may be used to benefit interference mitigation. This is an enormous concept, as the name denotes, but if we can learn how to use it properly, collect the data and ask the correct questions of that data, we can use it to make decisions, solve problems and educate users.
To obtain a better idea of what Big Data means we need look no further than the Internet and the billions of people—and things—that are connected to it. Data is everywhere, to the point where it can seem quite overwhelming. By 2020, the United Nations Economic Commission for Europe forecasts that the amount of data on a global scale will reach approximately 40 Zettabytes. Imagine using this data with new processing techniques so that we can solve future interference scenarios at a faster rate than has been possible in the past.
What if, through predictive analytics and the ability to connect the dots quickly, value could be extracted from that data? The more quickly and effectively we can analyze the data, the more accurate our decision-making and problem solving become. With so much data out there, we should start to use it in a positive way.
Extracting Value From Big Data
First of all, core and relevant data must be collected. Our industry probably has such information securely locked away, and in a variety of different formats, which makes the task difficult at present, especially for a subject such as interference. However, collection is the first step; it will be a continuous process, and automation will also play its role.
Big Data enables prediction and problem solving through statistical analysis, and one technique in particular, Deep Learning, allows us to learn a great deal from the data we have access to. Deep Learning is an emerging technique that, when applied, enables computers to identify items of interest within large quantities of data and then to identify relationships between those items; the methods that may then be applied are able to extract meaningful information.
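The idea of a machine learning relationships from examples can be illustrated at its very smallest scale. The sketch below trains a single artificial neuron, the basic building block that deep learning stacks in many layers, to separate two classes of feature vectors. All feature names, values, and labels here are invented for illustration; they do not represent real interference data.

```python
# Hypothetical sketch: one artificial neuron (the building block that
# deep learning stacks into many layers) learning to separate two
# classes of feature vectors. The data is synthetic and illustrative.

def train_neuron(samples, labels, lr=0.1, epochs=50):
    """Learn weights and bias with the classic perceptron update rule."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                      # 0 when already correct
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    """Classify a feature vector with the learned weights."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Two invented classes: "clean" (label 0) versus "interference" (label 1).
samples = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
labels = [0, 0, 1, 1]
w, b = train_neuron(samples, labels)

print(predict(w, b, [0.15, 0.15]))  # prints 0 (clean-like input)
print(predict(w, b, [0.85, 0.85]))  # prints 1 (interference-like input)
```

Real deep learning replaces this single neuron with millions of them and learns the features themselves, but the principle is the same: relationships are extracted from examples rather than programmed by hand.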
What does this mean for satellite interference, and how does Big Data fit in?
IRG is highly conscious of this shift to Big Data and has recognized the benefits it can offer the satellite industry, benefits it has already delivered in multiple other industries. It is important that IRG doesn’t ‘miss the boat’ in terms of taking advantage of the positive effect that Big Data analytics could have on the problem of satellite interference, but we need to start working on this in earnest—now.
The frustrating issue of satellite interference is something that still causes problems for users of satellite systems. However, this is also an issue that is being aggressively tackled by both industry bodies and companies. Highly experienced engineers continue to work tirelessly to make the business of identifying and eradicating interferers a great deal easier than has been managed in the past. Interference is often completely unnecessary, yet it carries real consequences for the industry in terms of disruption and cost.
Pinpointing the source of the problem demands a sharp increase in resources and personnel, along with intense pressure to resolve the problem quickly. The vast majority of instances of interference come down to improper installation of ground equipment, lack of training, poor ground equipment manufacturing, and a lack of adherence to industry standards and guidelines. However, as we have seen, particularly in recent times, political motivation can also lead to incidents of malicious jamming, and this is extremely challenging to deal with.
Could analysis of all satellite interference, plus the additional relevant data that probably exists across the Internet, help us to fundamentally address the interference issue? If we retain every statistic, incident and detail of satellite interference, the data store will grow, and we can then apply Deep Learning methods to that information to help us predict and resolve future incidents, and potentially stop them from occurring in the first place.
By collecting such statistics and adding the analysis of interfering signal characteristics to the data store, certain “signatures” could be extracted that could lead to automatic classification of interference types and to better, more user-friendly tools to advance our mission of mitigating interference.
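One simple way such signatures could work is to summarize past incidents of each interference type as an average feature vector, then label a new event by its nearest stored signature. The sketch below is a minimal illustration of that idea only; the interference categories, feature names and values are all invented, not drawn from any real IRG data set.

```python
# Hypothetical sketch of signature-based auto-classification: each
# interference type is summarized by a stored "signature" (the average
# of its past feature vectors), and a new event is labeled by the
# nearest signature. Categories and feature values are invented.
import math

# Invented historical records: (carrier-offset, sweep-rate) features
# per incident, grouped by interference type.
history = {
    "adjacent_carrier": [(0.9, 0.2), (0.8, 0.3), (1.0, 0.25)],
    "sweeping_jammer": [(0.1, 0.9), (0.2, 0.8), (0.15, 0.95)],
}

def signature(records):
    """Average the feature vectors of past incidents of one type."""
    n = len(records)
    return tuple(sum(r[i] for r in records) / n for i in range(len(records[0])))

signatures = {label: signature(recs) for label, recs in history.items()}

def classify(event):
    """Label a new event by the closest stored signature (Euclidean)."""
    return min(signatures, key=lambda lbl: math.dist(signatures[lbl], event))

print(classify((0.85, 0.28)))  # prints "adjacent_carrier"
print(classify((0.12, 0.88)))  # prints "sweeping_jammer"
```

A production system would draw on far richer signal characteristics and learned rather than hand-averaged signatures, but the principle, matching new incidents against patterns distilled from historical data, is the one the paragraph above describes.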
IRG has started to build a repository of these basics, such as articles, statistics and presentations, that inform us about satellite interference. Granted, a small start; however, everything the group has presented since 2011 (and some from earlier events) is now located in one place and the data continues to grow. This can be found via the IRG website (satirg.org) or directly at http://data.satirg.org.
This is just the beginning, and there is a great deal the industry must learn before it can begin to apply Deep Learning to any data. At Satellite 2016, IRG will be participating on a panel that will ask the question “Can Big Data Tackle Satellite Interference Challenges?” Industry discussions such as this one are vitally important so that we may address how we move forward and whether Big Data can actually be a pivotal tool in solving the issues of interference.
Deep Learning and predictive analysis are evolving rapidly and, as our understanding develops, are set to play a much deeper role across all industries. It is important that the satellite sector seize upon this opportunity, as it could provide yet another critical solution to the industry-wide challenge that is constantly presented by interference.
“Can Big Data Tackle Satellite Interference Challenges?” takes place on Wednesday, March 9, at 2:15 p.m. in Room Annapolis 1-2.