
Java/Scala Spark MapReduce Engineer


Job Details
Job Order Number
JC158819805
Company Name
Kforce
Physical Address

Wilmington, DE 19893
Job Description

Kforce has a client, one of the world’s largest financial institutions, that is seeking to hire a team of Java/Scala Spark MapReduce Engineers to build a strategic, analytic Data Ecosystem in Wilmington, Delaware (DE) and Plano, Texas (TX) using Hadoop.

Responsibilities:

  • Engage with business and multiple IT teams to understand new solutions requiring data publication and/or new data structures, and to understand how the business functionality maps to the source system data
  • Create optimal functional designs from the high-level designs, intersecting business requirements, business and source system data knowledge, platform and facilities, best practices, and regulatory and security controls requirements
  • Create technical designs from data publication-driven functional specifications, then develop and implement solutions based on those designs; this may include designing, creating, testing, and migrating new components into a number of platforms or facilities such as Teradata/Informatica, Hadoop/Ab Initio, Cloud, Java/Spark, or Kafka (an illustrative Spark sketch follows this list)
  • Utilize best practices across all aspects of the SDLC to deliver high-quality, lasting ETL solutions that meet the needs of our clients and are consistent with long-term DECO, controls, and business strategies
Requirements:

  • College degree or specialized training/equivalent work experience
  • 3 years as a developer in a data warehousing or other data-oriented, batch-processing environment
  • 3 years using SQL, creating complex functional designs, including data mapping aligned with organizational and architectural goals and strategy
  • 3 years of hands-on experience with Java or Scala and 2 years with Spark
  • Must have experience with REST, Spring, relational databases, Unix shell scripting, and scheduling tools (Control-M preferred)
  • Solid understanding of and experience with data consumption from Hadoop
  • Pluses: experience with Tableau/BI reporting tools and Ab Initio/Informatica ETL tools
  • Experience with issue analysis and resolution, including the use of issue-resolution processes for application production problems
  • Experience with release management and installations, including preparing for and overseeing installations
  • Experience working in an Agile setting
  • Demonstrated ability to work in a team environment with a structured SDLC and to interface and coordinate with a variety of business and IT groups
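The posting itself includes no code, but as a rough illustration of the kind of Java/Scala Spark batch ETL work described above, here is a minimal sketch in Scala. The object name, input/output paths, and column names are hypothetical placeholders, not details of the role.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal illustrative sketch of a Spark batch ETL job in Scala.
// Paths and column names below are hypothetical, not from the posting.
object AccountBalanceRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("AccountBalanceRollup")
      .getOrCreate()

    // Read raw source data published to HDFS (hypothetical location and schema)
    val transactions = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///data/raw/transactions")

    // Simple aggregation: total transaction amount per account
    val rollup = transactions
      .groupBy("account_id")
      .agg(sum("amount").as("total_amount"))

    // Publish the curated result back to the data ecosystem as Parquet
    rollup.write
      .mode("overwrite")
      .parquet("hdfs:///data/curated/account_balance_rollup")

    spark.stop()
  }
}
```

In practice a job like this would typically be packaged with sbt or Maven, submitted via spark-submit, and scheduled through a tool such as Control-M, per the requirements above.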

Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

Compensation Type: Hours
Minimum Compensation: 0.00
Maximum Compensation: 0.00

