
Data Analytics Software for Automatic Detection of Anomalies in Well Testing

Abstract

This paper presents software developed to diagnose well test data. The software monitors the data and, through a series of algorithms, alerts the user to discrepancies, allowing the user to investigate possible sources of error and correct them in real time. Several datasets from previous operations were analyzed, and the basic physics governing how each datum depends on others was laid out. All the well test data traditionally acquired were organized into a matrix showing the dependencies between each datum and the other physical properties available, either measured or modelled. Acceptable fluctuations in the acquired data were also identified for use as tolerance limits. The software scans the data as it is acquired, raises an alarm when an identified dependency is broken, and indicates which parameter is most likely causing the error.
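As a rough illustration of this kind of check, the sketch below encodes a small dependency matrix as predictor functions with tolerance limits and scans one acquired sample; the quantities, relations, and tolerance values are assumptions made for illustration, not the dependencies or limits used in the paper.

```python
# Minimal sketch of a dependency-and-tolerance check (illustrative only).
# Each acquired datum is compared against a value predicted from other
# measured or modelled quantities; exceeding the tolerance raises an alarm.
# The quantities, relations, and tolerances below are hypothetical examples.
import math

def predicted_dp(s):
    # Differential pressure expected from the two gauge pressures.
    return s["p_upstream"] - s["p_downstream"]

def predicted_rate(s):
    # Orifice-style dependence: rate should scale with the square root of dP.
    return s["orifice_coeff"] * math.sqrt(s["dp"])

DEPENDENCIES = {
    # datum: (predictor built from other data, absolute tolerance)
    "dp":   (predicted_dp,   2.0),   # psi
    "rate": (predicted_rate, 50.0),  # bbl/d
}

def scan(sample):
    """Return an alarm message for every datum whose dependency is broken."""
    alarms = []
    for datum, (predictor, tol) in DEPENDENCIES.items():
        expected = predictor(sample)
        residual = abs(sample[datum] - expected)
        if residual > tol:
            alarms.append(f"{datum}: measured {sample[datum]:.1f}, "
                          f"expected {expected:.1f}, residual {residual:.1f} > {tol}")
    return alarms

# One acquired sample scanned as it arrives.
sample = {"p_upstream": 1500.0, "p_downstream": 1380.0,
          "dp": 155.0, "rate": 2600.0, "orifice_coeff": 210.0}
for msg in scan(sample):
    print("ALARM:", msg)
```

A fuller version could also rank parameters by how many broken dependencies they appear in, to point at the most likely offending measurement, as the paper describes.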
The software was built from previous well test data and reports. Two field trials were subsequently conducted to fine-tune the algorithms and the allowable data fluctuations. Validating the software consisted of (1) identifying flagged errors that should not have been flagged (dependencies set too tight), (2) identifying errors that should have been flagged but were not (dependencies set too loose), and (3) improving the user interface for ease of use. The results were positive, with several improvements in error recognition and several discrepancies flagged that would not have been caught by the naked eye. The user interface was also improved, allowing the user to clear error messages and provide input to improve the algorithm. The field trials also demonstrated that the methodology scales to other data acquisition plans and to more advanced analytics.
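The tolerance-tuning loop from the field trials can be pictured with a sketch like the one below, where alarms raised by the software are compared against operator-confirmed discrepancies and a tolerance is widened when it produces false positives or tightened when it misses real errors; the data structures and the 10% adjustment rule are illustrative assumptions, not the paper's procedure.

```python
# Illustrative sketch of tolerance tuning from field-trial feedback.
# "flagged" holds the sample indices the software alarmed on; "confirmed"
# holds those the operator verified as real discrepancies.

def tune_tolerance(datum, tolerance, flagged, confirmed):
    """Widen or tighten one datum's tolerance based on trial feedback."""
    false_positives = flagged[datum] - confirmed[datum]   # alarmed but not real
    false_negatives = confirmed[datum] - flagged[datum]   # real but not alarmed
    if false_positives and not false_negatives:
        tolerance *= 1.10   # dependency set too tight: widen the band
    elif false_negatives:
        tolerance *= 0.90   # dependency set too loose: tighten the band
    return tolerance, len(false_positives), len(false_negatives)

# Example feedback for the hypothetical "dp" check.
flagged   = {"dp": {3, 7, 12, 19}}
confirmed = {"dp": {7, 12}}
new_tol, fp, fn = tune_tolerance("dp", 2.0, flagged, confirmed)
print(f"dp tolerance -> {new_tol:.2f} psi ({fp} false positives, {fn} false negatives)")
```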
The algorithms are simple, allowing the software to be used in all operations; more advanced algorithms are likely to depend on job-specific data and parameters. Traditional data acquisition systems used during well testing only present the data, and alarms draw the user's attention only when certain defined operability limits are about to be reached. Being able to confirm that the data are cohesive during the well test prevents a loss of confidence in the results and painful post-processing exercises. Moreover, because the algorithms are based on simple physics, the software is easy to deploy in any operation.

Related Results

Predictive Analytics with Data Visualization
There has been tremendous growth in the need for analytics and BI tools in every organization and every sector, such as finance, software, medicine, and even astronomy, in order to be...
Software Protection
ABSTRACT: Software piracy has been a major issue for software industries. Piracy has become so prevalent over the Internet that it poses a major threat to software product companies. W...
Anomaly Detection Using Puzzle-Based Data Augmentation to Overcome Data Imbalances and Deficiencies
Machine tools are used in a wide range of applications, and they can manufacture workpieces flexibly. Furthermore, they require maintenance; the overall costs include maintenance c...
Temporal integration of monaural and dichotic frequency modulation
Frequency modulation (FM) detection at low modulation frequencies is commonly used as an index of temporal fine structure processing to demonstrate age- and hearing-related deficit...
Winter‐to‐winter recurrence and non‐winter‐to‐winter recurrence of SST anomalies in the central North Pacific
All previous studies of the winter‐to‐winter recurrence (WWR) of sea surface temperature anomalies (SSTA) have focused on mean climatic characteristics. Here, interannual variabili...
DATA-ANALYTICS
ABSTRACT– The exponential growth of data in the modern world presents both challenges and opportunities for researchers. This paper explores the field of data analytics, focusing o...
Fusion of Machine learning for Detection of Rumor and False Information in Social Network
In recent years, the spread of social media platforms and mobile devices has led to more social data, advertisements, political opinions, and celebrity news, proliferating fake news. Fake n...
Abstract POSTER-TECH-1131: Next generation protein multiplexing
Abstract While a variety of technologies exist to measure proteins, such as ELISAs, mass spec, and 2D gels, the most promising for screening multiple proteins are ...
