Today’s companies deal with unstructured, disparate data arriving through many different channels thanks to the proliferation of data sources, and as a result they are unable to fully exploit its potential. Making sense of this volume of data requires not only relevant financial and statistical models but also IT infrastructures fast and scalable enough to acquire, manage, and process the information.
ASPertise’s Big Data team is knowledgeable in the latest tools and technologies for building systems and models that extract meaning from our clients’ data. In addition to their computing expertise, some of our team members have specific training in statistics and mathematics, which is invaluable for building relevant data models. Our team is also experienced in developing software solutions based on real-time, scalable architectures, and has acquired hands-on knowledge of NoSQL databases (such as MongoDB) and of tools such as HBase, Hive, and Hadoop.
Examples of development work carried out by our team include:
- High-throughput data processing for financial institutions
- Development of high-availability systems for financial market data
- Maintenance of a high-throughput, high-visibility system in Java
- Development of machine-learning-based systems for data mining
- Prototyping of models and algorithms in MATLAB
- Server-side data analysis for clustering, classification, and regression
- Architecture and development of high-performance servers
- Development of real-time systems for filtering GPS data