We use cookies to provide the essential functionality of the website and its advanced features, to analyse traffic and to improve our services to you, and to provide personalised content to you.
By clicking ‘Accept All’ you agree to these. You can change your selection at any time under the ‘Cookie Settings’ link on the page.
You can read more about our cookies in our Cookie Policy.
Cookie Settings
These cookies are essential to the functionality of thebigjobsite.com
When you log in to the Internet Site, the Company will set a cookie containing a randomly generated unique reference number. This anonymous number allows the Company to identify you. The Company will never store your personal information directly in a cookie. A persistent cookie will also be set; persistent cookies are not deleted when you close your browser, and allow the Internet Site to recognise you on your next visit.
Name | Expiration | Description
ATTBCookie* | 2 years | Used to remember a user's choice about cookies on thebigjobsite.com. Where a user has previously indicated a preference, that preference will be stored in these cookies.
last-search | 1 day | Used by thebigjobsite.com to pass search data between our own pages.
search | Session | Used by thebigjobsite.com to pass search data between our own pages.
redirect-stage | 1 hour | Used by thebigjobsite.com to pass search data between our own pages.
original-keyword | 1 hour | Used by thebigjobsite.com to pass search data between our own pages.
datadome | 1 year | DataDome is a cybersecurity solution used to detect bot activity.
jjap | 1 day | Used to track whether you have seen the Job Alerts prompt. Job Alerts is a service you can subscribe to in order to receive information about new jobs.
Advanced features of the site use cookies to provide information you have requested and to save you from re-entering the same fields repeatedly.
Name | Expiration | Description
attb-loc | 3 months | Stores your location information so that we can pre-populate search fields to find jobs near you.
Analytic cookies allow the Company to see how the Internet Site is being used. This information forms the basis of future development work, and so enables the Company to continually improve its Internet Site to best suit its users.
Name | Expiration
__gads | 13 months
_ga | 2 years
_ga_JH3TWMTYRK | 2 years
_gat_gtag_UA_1462011_9 | 1 minute
_gcl_au | 90 days
_gid | 24 hours
_uetsid | 1 day
_uetvid | 16 days

Description (applies to all of the above) – Google Analytics: For purposes of analytics, your UserID may be tracked and sent to Google Analytics after you register for one of our services such as Job Alerts. After you register, your registered session may be stitched together with your original, unauthenticated session. This allows longer-term tracking to help us monitor the effectiveness of our marketing campaigns. No personally identifiable information, or data that permanently identifies your device, is sent along with your tracking IDs.
Candidate should be able to:
• Work closely with our QA Team to ensure data integrity and overall system quality
• Work closely with Technology Leadership, Product Managers, and the Reporting Team to understand functional and system requirements
• Write shell/Python scripts for job scheduling and data wrangling
• Write Sqoop jobs to import/export data
• Experience with Hadoop and the HDFS ecosystem
• Strong experience with Apache Spark, Storm, and Kafka is a must
• Experience with Python, R, Pig, Hive, Kafka, Knox, Tomcat and Ambari
• Experience with MongoDB
• A minimum of 4 years working with HBase/Hive/MRV1/MRV2 is required
• Experience in integrating heterogeneous applications is required
• Ex
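The scripting duty listed above (shell/Python scripts for job scheduling and data wrangling) might look something like the following minimal Python sketch. The CSV layout, the column names (`title`, `salary`), and the salary threshold are illustrative assumptions for this example, not details from the posting:

```python
import csv
import io

def wrangle(raw_csv, min_salary):
    """Parse a CSV export, drop malformed rows, and keep jobs above a salary floor.

    raw_csv: CSV text with (hypothetical) `title` and `salary` columns.
    min_salary: rows whose salary parses below this are dropped.
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    cleaned = []
    for row in reader:
        try:
            salary = int(row["salary"])
        except (KeyError, ValueError):
            continue  # skip rows with a missing or non-numeric salary
        if salary >= min_salary:
            cleaned.append({"title": row["title"].strip(), "salary": salary})
    return cleaned

sample = "title,salary\nHadoop Engineer,120000\nIntern,not-a-number\nData Analyst,90000\n"
print(wrangle(sample, 100000))  # keeps only the Hadoop Engineer row
```

In practice a script like this would be wired into a scheduler (e.g. cron or Oozie) rather than run by hand, but the filter-and-normalise shape of the wrangling step is the same.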
Experience Required: 6 - 12 Years
Industry Type: IT
Employment Type: Permanent
Location: India
Description
Primary tech skills – Azure, Databricks, PySpark, SQL
Secondary tech skills – ADF, Synapse
Roles:
• Experience in PySpark, SQL, and cloud data warehouses such as Azure Synapse and Databricks
• Experience working with structured and unstructured data
• Extensive knowledge of data warehousing concepts, strategies, and methodologies
• Experience in ETL/Pipe
Candidate should be able to:
• Work on a solution that enables our Data Scientists and Data Analysts to actively detect trends, patterns, and relationships in data sets
• Engage in the enhancement and maintenance support of data movement solutions to meet client business objectives, with the opportunity to work with a cross-disciplinary team
• Contrib
Candidate should have:
• Superior oral and written communication skills, as well as the willingness to collaborate across teams of internal and external technical staff, business analysts, software support, and operations staff
• The ability to write abstracted, reusable code components
• Hands-on experience in writing shell scripts and complex SQL queries
• Hands-on experience in at least one programming language – Python (preferred), Java, or Scala – and a non-procedural language such as SQL
• Experience with data science and machine learning is a plus
• Experience working with large data sets in distributed computing to perform massive parallel processing, which contributes to building an analytics
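The "abstracted, reusable code components" and "complex SQL queries" requirements above can be illustrated with a small self-contained sketch: a query wrapped in a reusable function, demonstrated against an in-memory SQLite database. The `job_skills` table and its columns are hypothetical, chosen only for this example:

```python
import sqlite3

def top_skills(conn, limit=3):
    """Reusable query component: the most frequently requested skills.

    Table and column names (`job_skills`, `skill`) are illustrative assumptions.
    """
    sql = """
        SELECT skill, COUNT(*) AS n
        FROM job_skills
        GROUP BY skill
        ORDER BY n DESC, skill
        LIMIT ?
    """
    return conn.execute(sql, (limit,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_skills (job_id INTEGER, skill TEXT)")
conn.executemany(
    "INSERT INTO job_skills VALUES (?, ?)",
    [(1, "Spark"), (1, "SQL"), (2, "Spark"), (2, "Python"), (3, "Spark"), (3, "SQL")],
)
print(top_skills(conn, 2))  # [('Spark', 3), ('SQL', 2)]
```

Keeping the SQL behind a parameterised function like this is one way code components stay reusable across reporting and analytics tasks instead of being pasted ad hoc into scripts.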
Description
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects th
Job Source: Epam
Hadoop Big Data Engineer
Chennai
• As an individual contributor, design modules and apply creative problem-solving using tools and technologies.
• As an individual contributor, code and test modules.
• Interact and collaborate directly with software developers, product managers, and business analysts to ensure proper development and quality of service applications and products.
• Ability to do development in an Agile environment.
• Work closely with Leads and Architects to understand the requirements and translate them into code.