Machine Learning & Cognitive Computing

Over the last decade our teams have worked on a range of big data technologies, specifically in the healthcare domain. IBOTSystems was one of the first companies in Atlanta to build a big data infrastructure from scratch using commodity hardware and the Apache Hadoop stack.

We have also worked extensively on large-scale processing of both structured and unstructured data using tools such as Elasticsearch, Apache Storm, and MongoDB.

We have developed traditional data warehouses and dimensional models over both structured and unstructured big data stores, using tools and databases such as PostgreSQL, Pentaho Data Integrator, JasperServer, and Tableau. We also have experience developing sophisticated relevancy engines over structured and unstructured data.
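To illustrate the idea behind a relevancy engine, here is a minimal TF-IDF scoring sketch in pure Python. This is an illustrative toy, not IBOTSystems' production engine (which would typically sit on top of Elasticsearch or Solr); the function name and the whitespace tokenization are our own simplifying assumptions.

```python
import math
from collections import Counter

def tf_idf_scores(query, docs):
    """Score each document in docs against the query using TF-IDF weighting.

    Illustrative sketch only: tokenization is naive whitespace splitting.
    """
    tokenized = [doc.lower().split() for doc in docs]
    n_docs = len(tokenized)

    # Document frequency: in how many documents does each term appear?
    df = Counter()
    for tokens in tokenized:
        for term in set(tokens):
            df[term] += 1

    scores = []
    for tokens in tokenized:
        tf = Counter(tokens)
        score = 0.0
        for term in query.lower().split():
            if term in tf:
                # Rarer terms across the corpus get a higher idf weight.
                idf = math.log(n_docs / df[term])
                score += (tf[term] / len(tokens)) * idf
        scores.append(score)
    return scores
```

Production engines layer much more on top of this (field boosting, analyzers, phrase matching), but the weighting intuition is the same: terms that are frequent in a document yet rare in the corpus make that document more relevant.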

We have helped our customers and partners move up the data food chain: some have evolved from consuming structured and unstructured data to creating their own data using machine learning algorithms.
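One simple way a customer can "create data" is to train a classifier on a small labeled sample and use it to label the rest of their records. The sketch below uses a nearest-centroid classifier in pure Python; the function names, feature vectors, and class labels are all hypothetical, chosen only to keep the example self-contained.

```python
def centroid(points):
    """Mean of a list of equal-length numeric feature vectors."""
    dims = len(points[0])
    return tuple(sum(p[d] for p in points) / len(points) for d in range(dims))

def train(labeled):
    """labeled: {class_name: [feature_vector, ...]} -> per-class centroids."""
    return {cls: centroid(vecs) for cls, vecs in labeled.items()}

def predict(model, vec):
    """Assign vec to the class whose centroid is closest (squared Euclidean)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda cls: sq_dist(model[cls], vec))
```

Once trained, `predict` can be run over unlabeled records to generate new labeled data, which is the kind of derived data product referred to above.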

Cognitive computing is the next level of R&D work happening within IBOTSystems. It covers a class of problems with no clear answers and considerable ambiguity, problems traditionally handled by the human brain.

Skills / Expertise
Apache Hadoop & all related projects
Elasticsearch
Apache Storm
MongoDB
PostgreSQL
Cassandra
Apache Solr
Pentaho Data Integrator
JasperServer
Tableau
MySQL
Apache Kafka
Apache Nutch
Cloudera and Hortonworks Hadoop distributions

Research
IBM Watson
Apache UIMA
JAGA