UrbanPro
Ramu J, Big Data Trainer in Hyderabad

Verified details of Ramu J

Identity

Education

Know how UrbanPro verifies Tutor details

Identity is verified based on matching the details uploaded by the Tutor with government databases.

Overview

Over 8 years of professional IT experience, including 4+ years with Big Data and the Hadoop ecosystem along with Spark. Experience in requirement gathering, designing, developing, testing, implementing and maintaining systems, across all phases of the Software Development Life Cycle (SDLC). Expertise in Hadoop architecture and its components: the Hadoop Distributed File System (HDFS), MapReduce, NameNode, DataNode, Job Tracker, Task Tracker, Secondary NameNode and YARN. Expertise in developing and implementing Big Data solutions and data mining applications on Hadoop using Hive (Hive2), Pig, Spark, Sqoop, Impala, Hue, HBase and Oozie workflows. Expertise in working with Hue (the Hadoop user interface) for project development and testing, including Hadoop testing through the Hue interface.
Proof-of-concept (POC) experience in Spark with single RDDs, pair RDDs and DStreams.
Extensive expertise in extracting and loading data to various databases, including Oracle, MS SQL Server, Teradata, flat files and XML files.
Extensive expertise in developing XSDs and XSLTs, and preparing XSD-compliant XML files to parse XML data into flat files for processing into HDFS.
Developed Avro schemas to create Avro and Parquet tables in Hive using the Avro schema URL.
Good experience working with SerDes such as the Avro and Parquet formats.
Good experience developing reports using Hive queries and Hive UDFs, and preparing Pig scripts.
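As a hedged illustration of the Avro-schema workflow mentioned above (the record name, fields and HDFS path are assumptions for the sketch, not taken from this profile):

```python
import json

# Hypothetical Avro schema of the kind referenced above; Hive can create an
# Avro-backed table whose columns are derived from this schema file.
schema = {
    "type": "record",
    "name": "Customer",
    "namespace": "com.example",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "name", "type": "string"},
        # Nullable field: a union with "null" and a default value.
        {"name": "signup_date", "type": ["null", "string"], "default": None},
    ],
}
schema_json = json.dumps(schema, indent=2)

# Hive DDL pointing an external table at the schema file via avro.schema.url
# (the hdfs:// path is illustrative):
ddl = (
    "CREATE EXTERNAL TABLE customers\n"
    "STORED AS AVRO\n"
    "TBLPROPERTIES ('avro.schema.url'='hdfs:///schemas/customer.avsc');"
)
```

Keeping the schema in a separate `.avsc` file referenced by URL lets several tables (and Spark jobs) share one schema definition instead of repeating column lists in each DDL.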

Languages Spoken

Telugu

Tamil

Hindi

English

Education

Dravidian University, 2009

Master of Computer Applications (M.C.A.)

Address

Sanjeeva Reddy Nagar, Hyderabad, India - 500038

Verified Info

ID Verified

Phone Verified

Email Verified

Facebook Verified


Teaches

Big Data Training
6 Students

Class Location

Online (video chat via Skype, Google Hangouts, etc.)

Student's Home

Tutor's Home

Years of Experience in Big Data Training

15

Big Data Technology

Hadoop

Teaching Experience in detail in Big Data Training

- Overall, 14 years of experience in the fields of Big Data / BI and GCP
- Google Cloud Certified Professional Cloud Architect with 3 years of experience
- Extensive experience in IT data analytics projects; hands-on experience migrating on-premise ETLs to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Dataflow, Data Fusion, Cloud Functions, Pub/Sub, Composer, Airflow and Google Cloud Storage
- Good experience in data migration from on-premise MS SQL Server to Snowflake on Azure
- Completed the Azure DP-900 certification
- Knowledge of various ETL and data integration development tools, such as Informatica, Ab Initio and IBM DataStage
- Experience with data validation automation tools
- Experience in business intelligence testing across reports built with Tableau, Power BI and Cognos
- Good experience with management tools such as Azure DevOps, Jira, ALM and VSTS
- Experience preparing test strategies, test plans and test estimations
- Worked in both Agile and Waterfall models
- Good knowledge of automation tools
- Expertise in analyzing and reviewing business, functional and high-level technical requirements; designing detailed technical components for complex applications using high-level architecture, design patterns and reusable code
- Expertise in the design and architecture of Big Data platforms and cloud technologies, building secure infrastructure for multi-site data centers and protecting large volumes of data
- Good experience with multi-cluster architectures, such as on-premises-to-cloud and cloud-to-cloud
- Strong experience delivering Big Data projects using open-source technologies such as Hadoop, PySpark, Sqoop, Hive, HBase, Kafka, Oozie, BigQuery and GCS
- Extensive work experience across infrastructure domains, e.g. public Google Cloud Platform and operating systems such as UNIX and Windows
- Extensive experience implementing DevOps methodologies on cloud platforms, with hands-on experience designing and creating CI/CD pipelines with tools such as Jenkins, Git and GitHub
- Good project management skills covering initiating, planning, executing, monitoring and controlling
- Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning and advanced data processing
- Experience optimizing ETL workflows
- Good experience with Hadoop data warehousing tools such as Hive, including extracting data from the cluster using the PySpark JDBC API
- Skilled in writing code for intermediate-to-complex modules following development standards, and in planning and conducting code reviews to ensure standards compliance and systems interoperability
- Hands-on experience with Job Tracker, Task Tracker, NameNode, DataNode, Resource Manager, Node Manager, Application Master, YARN and MapReduce concepts
- Excellence in managing the Hive data warehouse: creating tables, distributing data via partitioning and bucketing, and writing and optimizing HiveQL queries
- Extensive expertise in extracting and loading data to various databases, including Oracle, MS SQL Server, Teradata, flat files and XML files, using Talend
- Extensive expertise in developing XSDs and XSLTs, and preparing XSD-compliant XML files to parse XML data into flat files for processing into HDFS
- Good experience working with SerDes such as the Avro and Parquet formats
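The Hive partitioning and bucketing mentioned above can be sketched as follows. This is a minimal, hedged example: the table and column names are hypothetical, and the helper only assembles HiveQL DDL as a string rather than talking to a Hive cluster.

```python
def partitioned_table_ddl(table, columns, partition_col, bucket_col, num_buckets):
    """Build CREATE TABLE DDL with a string partition column and bucketing.

    Partitioning by a low-cardinality column (e.g. a date) prunes whole
    directories at query time; bucketing by a high-cardinality key spreads
    rows evenly across a fixed number of files.
    """
    col_defs = ", ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE TABLE {table} ({col_defs}) "
        f"PARTITIONED BY ({partition_col} STRING) "
        f"CLUSTERED BY ({bucket_col}) INTO {num_buckets} BUCKETS "
        f"STORED AS PARQUET"
    )

# Hypothetical sales table: partitioned by day, bucketed by order id.
ddl = partitioned_table_ddl(
    "sales",
    [("order_id", "BIGINT"), ("amount", "DOUBLE")],
    "sale_date",
    "order_id",
    32,
)
```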

Google Cloud Platform

Class Location

Online (video chat via Skype, Google Hangouts, etc.)

Student's Home

Tutor's Home

Years of Experience in Google Cloud Platform

15

Teaching Experience in detail in Google Cloud Platform

(Same detailed teaching experience as listed under Big Data Training above.)

Python Training classes

Class Location

Online (video chat via Skype, Google Hangouts, etc.)

Student's Home

Tutor's Home

Years of Experience in Python Training classes

15

Course Duration provided

1-3 months

Seeker background catered to

Educational Institution, Individual, Corporate company

Certification provided

Yes

Python applications taught

PySpark

Teaching Experience in detail in Python Training classes

(Same detailed teaching experience as listed under Big Data Training above.)

Microsoft Azure Training

Class Location

Online (video chat via Skype, Google Hangouts, etc.)

Student's Home

Tutor's Home

Years of Experience in Microsoft Azure Training

15

Azure Certification offered

Azure Certified Data Engineer

Teaching Experience in detail in Microsoft Azure Training

(Same detailed teaching experience as listed under Big Data Training above.)

Reviews (5)

4.4 out of 5 (5 reviews)

Ramu J
S

Hadoop

"As I am taking course from last 1 and half month I have learned so many new things. Like hive,pig,sqoop. I got an idea of all these things. He will be telling us oozie and spark as well. It was very good experience. I learned so many things. He has explained everything very clearly. He has cleared all doubts regularly. "

Ramu J
G

Hadoop

"Good trainer for beginners, trying hard for the students and clarifying doubt then and there, easy to follow his class. "

Ramu J
S

Hadoop

"The training was good. I feel there should be a two days revision so that we get to know all the things "

Ramu J
R

Hadoop

"No one can teach Big Data Concepts like Ramu Sir. He is excellent. I attended many training institutes to learn Hadoop. I got satisfied only with Ramu Sir's teaching. Ramu has great patience. If we don't understand any topic, he gives very good examples to makes us understand. If anyone wants to learn Hadoop, I would confidently say attend Ramu Sir's without any second opinion. "


FAQs

1. Which classes do you teach?

I teach Big Data, Google Cloud Platform, Microsoft Azure, and Python training classes.

2. Do you provide a demo class?

Yes, I provide a free demo class.

3. How many years of experience do you have?

I have been teaching for 15 years.

Teaches

Big Data Training
6 Students

Class Location

Online (video chat via skype, google hangout etc)

Student's Home

Tutor's Home

Years of Experience in Big Data Training

15

Big Data Technology

Hadoop

Teaching Experience in detail in Big Data Training

 Overall, 14 Years of experience in the fields of Big Data / BI and GCP  Certified Google Cloud Professional cloud Architect with 3 years of experience  Have Extensive Experience in IT data analytics projects, Hands on experience in migrating on premise ETLs to Google Cloud Platform (GCP) using cloud native tools such as Google BigQuery,DataFlow,DataFusion,Cloud Function,Pub\Sub, Composer,Airflow,Google Cloud Storage  Good experience in Data migration from on-prem MSSQL to Azure Cloud Snowflake DB  Completed Azure DP-900 certification  Knowledge in various ETL and Data Integration development tools like Informatica,Abi Intio and IBM Data Stage  Experience in Data Validation automation tool  Experience in Business Intelligence testing in various reports using Tableau, Power BI and Cognos framework tools  Good experience in Management tools as Azure Deveps, Jira,ALM and VSTS  Experience in preparing Test Strategy, Test Plan and Test estimation  Worked in Agile and Waterfall models  Good knowledge in good automation tools  Expertise in analyzing& reviewing business, functional and high-level technical requirements; designing detailed technical components for complex applications utilizing high-level architecture, design patterns and reusable code.  Gained expertise in design/architecture of Bigdata platforms and cloud technologies, building infrastructures with a secure solution for multi-site data centers and protecting/securing large volumes of data.  Good experience on multi-cluster architecture like on-premises to cloud, cloud to cloud architecture  Strong experience in delivering Big Data related projects using open-source technologies like Hadoop,py Spark, Sqoop, Hive, HBase, Kafka, Oozie, bigquery,gcs  Extensive Work Experience of infrastructure domains. E.g. 
Public Google Cloud Platform, Operating Systems like UNIX, Windows,  Extensive experience in implementing DevOps methodologies on Cloud platforms and through hands on experience in designing & creation of CI/CD pipelines with the tools like Jenkins, GIT, GitHub.  Having good Project Management skills which involves initiating, planning, executing, monitoring, controlling  Familiar with data architecture including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing.  Experience in optimizing ETL workflows.  Good experience on Hadoop tools related to Data warehousing like Hive and also involved in extracting the data from these tools on the cluster using pyspark JDBC API.  Skilled in executing programming code for intermediate to complex modules following development standards, planning and conducting code reviews for changes and enhancements that ensure standards compliance and systems interoperability.  Hands-on experience in working on Job Tracker, Task Tracker, Name Node, Data Node, Resource Manager, Node Manager, Application Master, YARN and MapReduce Concepts.  Excellence in managing Hive data warehouse tool-creating tables, data distribution by implementing partitioning and bucketing, writing and optimizing the HiveQL queries.  Extensive expertise in Extracting and Loading data to various databases including Oracle, MS SQL Server, Teradata, Flat files, XML files using Talend.  Extensive expertise in developing XSD, XSLT and preparing XML files compatible to the xsd to parse the xml data into flat files to process into HDFS.  Good Experience in working with SerDe’s like Avro Format, Parquet format data.

Google Cloud Platform

Class Location

Online (video chat via skype, google hangout etc)

Student's Home

Tutor's Home

Years of Experience in Google Cloud Platform

15

Teaching Experience in detail in Google Cloud Platform

 Overall, 14 Years of experience in the fields of Big Data / BI and GCP  Certified Google Cloud Professional cloud Architect with 3 years of experience  Have Extensive Experience in IT data analytics projects, Hands on experience in migrating on premise ETLs to Google Cloud Platform (GCP) using cloud native tools such as Google BigQuery,DataFlow,DataFusion,Cloud Function,Pub\Sub, Composer,Airflow,Google Cloud Storage  Good experience in Data migration from on-prem MSSQL to Azure Cloud Snowflake DB  Completed Azure DP-900 certification  Knowledge in various ETL and Data Integration development tools like Informatica,Abi Intio and IBM Data Stage  Experience in Data Validation automation tool  Experience in Business Intelligence testing in various reports using Tableau, Power BI and Cognos framework tools  Good experience in Management tools as Azure Deveps, Jira,ALM and VSTS  Experience in preparing Test Strategy, Test Plan and Test estimation  Worked in Agile and Waterfall models  Good knowledge in good automation tools  Expertise in analyzing& reviewing business, functional and high-level technical requirements; designing detailed technical components for complex applications utilizing high-level architecture, design patterns and reusable code.  Gained expertise in design/architecture of Bigdata platforms and cloud technologies, building infrastructures with a secure solution for multi-site data centers and protecting/securing large volumes of data.  Good experience on multi-cluster architecture like on-premises to cloud, cloud to cloud architecture  Strong experience in delivering Big Data related projects using open-source technologies like Hadoop,py Spark, Sqoop, Hive, HBase, Kafka, Oozie, bigquery,gcs  Extensive Work Experience of infrastructure domains. E.g. 
Public Google Cloud Platform, Operating Systems like UNIX, Windows,  Extensive experience in implementing DevOps methodologies on Cloud platforms and through hands on experience in designing & creation of CI/CD pipelines with the tools like Jenkins, GIT, GitHub.  Having good Project Management skills which involves initiating, planning, executing, monitoring, controlling  Familiar with data architecture including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing.  Experience in optimizing ETL workflows.  Good experience on Hadoop tools related to Data warehousing like Hive and also involved in extracting the data from these tools on the cluster using pyspark JDBC API.  Skilled in executing programming code for intermediate to complex modules following development standards, planning and conducting code reviews for changes and enhancements that ensure standards compliance and systems interoperability.  Hands-on experience in working on Job Tracker, Task Tracker, Name Node, Data Node, Resource Manager, Node Manager, Application Master, YARN and MapReduce Concepts.  Excellence in managing Hive data warehouse tool-creating tables, data distribution by implementing partitioning and bucketing, writing and optimizing the HiveQL queries.  Extensive expertise in Extracting and Loading data to various databases including Oracle, MS SQL Server, Teradata, Flat files, XML files using Talend.  Extensive expertise in developing XSD, XSLT and preparing XML files compatible to the xsd to parse the xml data into flat files to process into HDFS.  Good Experience in working with SerDe’s like Avro Format, Parquet format data.

Python Training classes

Class Location

Online (video chat via skype, google hangout etc)

Student's Home

Tutor's Home

Years of Experience in Python Training classes

15

Course Duration provided

1-3 months

Seeker background catered to

Educational Institution, Individual, Corporate company

Certification provided

Yes

Python applications taught

PySpark

Teaching Experience in detail in Python Training classes

 Overall, 14 Years of experience in the fields of Big Data / BI and GCP  Certified Google Cloud Professional cloud Architect with 3 years of experience  Have Extensive Experience in IT data analytics projects, Hands on experience in migrating on premise ETLs to Google Cloud Platform (GCP) using cloud native tools such as Google BigQuery,DataFlow,DataFusion,Cloud Function,Pub\Sub, Composer,Airflow,Google Cloud Storage  Good experience in Data migration from on-prem MSSQL to Azure Cloud Snowflake DB  Completed Azure DP-900 certification  Knowledge in various ETL and Data Integration development tools like Informatica,Abi Intio and IBM Data Stage  Experience in Data Validation automation tool  Experience in Business Intelligence testing in various reports using Tableau, Power BI and Cognos framework tools  Good experience in Management tools as Azure Deveps, Jira,ALM and VSTS  Experience in preparing Test Strategy, Test Plan and Test estimation  Worked in Agile and Waterfall models  Good knowledge in good automation tools  Expertise in analyzing& reviewing business, functional and high-level technical requirements; designing detailed technical components for complex applications utilizing high-level architecture, design patterns and reusable code.  Gained expertise in design/architecture of Bigdata platforms and cloud technologies, building infrastructures with a secure solution for multi-site data centers and protecting/securing large volumes of data.  Good experience on multi-cluster architecture like on-premises to cloud, cloud to cloud architecture  Strong experience in delivering Big Data related projects using open-source technologies like Hadoop,py Spark, Sqoop, Hive, HBase, Kafka, Oozie, bigquery,gcs  Extensive Work Experience of infrastructure domains. E.g. 
Public Google Cloud Platform, Operating Systems like UNIX, Windows,  Extensive experience in implementing DevOps methodologies on Cloud platforms and through hands on experience in designing & creation of CI/CD pipelines with the tools like Jenkins, GIT, GitHub.  Having good Project Management skills which involves initiating, planning, executing, monitoring, controlling  Familiar with data architecture including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing.  Experience in optimizing ETL workflows.  Good experience on Hadoop tools related to Data warehousing like Hive and also involved in extracting the data from these tools on the cluster using pyspark JDBC API.  Skilled in executing programming code for intermediate to complex modules following development standards, planning and conducting code reviews for changes and enhancements that ensure standards compliance and systems interoperability.  Hands-on experience in working on Job Tracker, Task Tracker, Name Node, Data Node, Resource Manager, Node Manager, Application Master, YARN and MapReduce Concepts.  Excellence in managing Hive data warehouse tool-creating tables, data distribution by implementing partitioning and bucketing, writing and optimizing the HiveQL queries.  Extensive expertise in Extracting and Loading data to various databases including Oracle, MS SQL Server, Teradata, Flat files, XML files using Talend.  Extensive expertise in developing XSD, XSLT and preparing XML files compatible to the xsd to parse the xml data into flat files to process into HDFS.  Good Experience in working with SerDe’s like Avro Format, Parquet format data.

Microsoft Azure Training

Class Location

Online (video chat via skype, google hangout etc)

Student's home

Tutor's Home

Years of Experience in Microsoft Azure Training

15

Azure Certification offered

Azure Certified Data Engineer

Teaching Experience in detail in Microsoft Azure Training

- Overall 14 years of experience in the fields of Big Data / BI, and a Google Cloud Certified Professional Cloud Architect with 3 years of GCP experience
- Extensive experience in IT data analytics projects; hands-on experience migrating on-premise ETLs to Google Cloud Platform (GCP) using cloud-native tools such as Google BigQuery, Dataflow, Data Fusion, Cloud Functions, Pub/Sub, Composer, Airflow and Google Cloud Storage
- Good experience in data migration from on-premise MS SQL to Snowflake on Azure
- Completed the Azure DP-900 certification
- Knowledge of various ETL and data integration tools including Informatica, Ab Initio and IBM DataStage
- Experience with data validation automation tools
- Experience in Business Intelligence testing of reports built with Tableau, Power BI and Cognos
- Good experience with management tools such as Azure DevOps, Jira, ALM and VSTS
- Experience in preparing test strategies, test plans and test estimations
- Worked in both Agile and Waterfall models
- Good knowledge of automation tools
- Expertise in analyzing and reviewing business, functional and high-level technical requirements, and in designing detailed technical components for complex applications using high-level architecture, design patterns and reusable code
- Expertise in the design/architecture of Big Data platforms and cloud technologies, building infrastructure with secure solutions for multi-site data centers and protecting large volumes of data
- Good experience with multi-cluster architectures such as on-premise-to-cloud and cloud-to-cloud
- Strong experience delivering Big Data projects using open-source technologies such as Hadoop, PySpark, Sqoop, Hive, HBase, Kafka, Oozie, BigQuery and GCS
- Extensive work experience across infrastructure domains, e.g. the public Google Cloud Platform and operating systems such as UNIX and Windows
- Extensive experience implementing DevOps methodologies on cloud platforms, with hands-on experience designing and creating CI/CD pipelines using tools such as Jenkins, Git and GitHub
- Good project management skills covering initiating, planning, executing, monitoring and controlling
- Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning and advanced data processing
- Experience in optimizing ETL workflows
- Good experience with Hadoop data warehousing tools such as Hive, including extracting data from these tools on the cluster using the PySpark JDBC API
- Skilled in writing code for intermediate to complex modules following development standards, and in planning and conducting code reviews to ensure standards compliance and systems interoperability
- Hands-on experience with Job Tracker, Task Tracker, NameNode, DataNode, Resource Manager, Node Manager, Application Master, YARN and MapReduce concepts
- Excellent at managing the Hive data warehouse tool: creating tables, distributing data via partitioning and bucketing, and writing and optimizing HiveQL queries
- Extensive expertise in extracting and loading data to various targets including Oracle, MS SQL Server, Teradata, flat files and XML files using Talend
- Extensive expertise in developing XSDs and XSLTs, and in preparing XML files compatible with the XSD so the XML data can be parsed into flat files for processing in HDFS
- Good experience working with SerDes such as the Avro and Parquet formats
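The Hive data-warehouse work described above (partitioned and bucketed tables with optimized HiveQL queries) can be sketched roughly as follows. The table and column names here are hypothetical, chosen purely for illustration:

```sql
-- Hypothetical example: a Hive table partitioned by date and bucketed by user id.
-- Partitioning lets Hive prune whole partition directories at query time;
-- bucketing hashes user_id into a fixed number of files within each partition.
CREATE TABLE sales (
  user_id BIGINT,
  product STRING,
  amount  DECIMAL(10,2)
)
PARTITIONED BY (sale_date STRING)
CLUSTERED BY (user_id) INTO 32 BUCKETS
STORED AS PARQUET;

-- A query restricted to one partition value scans only that directory.
SELECT product, SUM(amount) AS total
FROM sales
WHERE sale_date = '2024-01-15'
GROUP BY product;
```

Bucketing on a join key such as `user_id` also allows Hive to use bucketed map joins between tables that share the same bucket count and key.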

4.4 out of 5 (5 reviews)

S

Hadoop

"As I have been taking the course for the last one and a half months, I have learned so many new things, like Hive, Pig and Sqoop. I got an idea of all these tools. He will be teaching us Oozie and Spark as well. It was a very good experience and I learned so many things. He has explained everything very clearly and has cleared all doubts regularly."

G

Hadoop

"Good trainer for beginners, trying hard for the students and clarifying doubts then and there; his class is easy to follow."

S

Hadoop

"The training was good. I feel there should be a two-day revision so that we get to know all the things."

R

Hadoop

"No one can teach Big Data concepts like Ramu Sir. He is excellent. I attended many training institutes to learn Hadoop, but I was satisfied only with Ramu Sir's teaching. Ramu has great patience: if we don't understand a topic, he gives very good examples to make us understand. If anyone wants to learn Hadoop, I would confidently say attend Ramu Sir's classes without a second opinion."


Ramu J describes himself as a GCP, Hadoop and PySpark data engineer trainer. He conducts classes in Big Data, Google Cloud Platform and Microsoft Azure Training. Ramu is located in Sanjeeva Reddy Nagar, Hyderabad. He takes classes at the student's home, regular classes at his own home, and online classes via an online medium. He has 15 years of teaching experience. Ramu completed his Master of Computer Applications (M.C.A.) at Dravidian University in 2009. He is well versed in Telugu, Tamil, Hindi and English. Ramu has received 5 reviews so far, with 100% positive feedback.


