We have been working on Apache Spark since 2014 (before Shark was merged into Apache Spark to form part of Spark SQL).
We have helped end users deploy and manage multiple Apache Spark clusters, ranging from 3 to 100+ nodes, processing data from hundreds of terabytes to petabytes.
We run Apache Spark clusters alongside Apache Hadoop and Apache HBase, in the cloud or on modern bare-metal, industry-standard high-end servers.
We also run standalone Apache Spark compute clusters with NVIDIA GPUs on modern bare-metal high-end servers to harness the full power of the machines.
We provide CPFA training on Apache Airflow, Apache Hadoop, Apache HBase, Apache Hive, and Apache Spark in local languages across Asia.
ACT NOW! SEATS ARE LIMITED EACH MONTH!