Directive to install Apache Spark, Airflow and Hadoop
Minimum bid: €36 EUR / hour
I need a directive to install Apache Spark, Airflow and Hadoop on a Linux server running Debian. If it helps, you can use appropriate containerization such as Kubernetes. Suggestions, questions and comments are welcome.
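A containerized setup along the lines requested could be sketched as a Docker Compose file. This is only an illustrative sketch, not a vetted deployment: the image tags, the Airflow `standalone` mode, and the `ENSURE_NAMENODE_DIR` setting are assumptions to be checked against each project's official image documentation.

```yaml
# Illustrative sketch only — image tags and settings are assumptions, not a tested stack.
version: "3.8"
services:
  spark-master:
    image: bitnami/spark:3.5        # assumed tag
    environment:
      - SPARK_MODE=master
    ports:
      - "7077:7077"                 # Spark master RPC
      - "8080:8080"                 # Spark master web UI
  spark-worker:
    image: bitnami/spark:3.5
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
    depends_on:
      - spark-master
  airflow:
    image: apache/airflow:2.9.0     # assumed tag
    command: standalone             # all-in-one mode, suitable for evaluation only
    ports:
      - "8081:8080"                 # Airflow web UI, remapped to avoid the Spark UI port
  namenode:
    image: apache/hadoop:3          # assumed tag
    command: ["hdfs", "namenode"]
    environment:
      - ENSURE_NAMENODE_DIR=/tmp/hadoop-root/dfs/name
    ports:
      - "9870:9870"                 # HDFS NameNode web UI
```

Since the posting mentions Kubernetes, a production route would more likely be the official Airflow Helm chart plus Spark's native Kubernetes scheduler (`spark-submit --master k8s://...`) rather than Compose; the sketch above is just the smallest single-host starting point.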
Project ID: #31822689
About the project
8 freelancers are bidding an average of €36/hour for this job
Greetings, I can configure Apache Spark, Airflow and Hadoop for you on a Linux server with Debian OS (Ubuntu 18/20). Let me know if you would be interested in my services. Any specific versions for Apache Spar… More
Hello, I am based in Germany. I am very interested in your project. Please send me a message so we can discuss more details in private and I can help you with your project. Thank you.
Hope you are doing well. I'm an MLOps & DevOps engineer with proven production experience. I have experience deploying that kind of data platform on AWS. Kubernetes was used instead of Hadoop YARN and the benefit was th… More
Hi, I just read your job posting and I think I am a fit for this job, as I have 4 years of experience building applications for everyone from small startups to big organizations like VMware, Western Union and Send Safely, and deploy… More
We have expertise in Apache, LAMP, Azure and AWS. Azure & GCP Certified Professional Cloud Architect with multitasking skills, having more than 10 years of extensive experience in AWS, Azure & Google Cloud Architectur… More
I am the best person for the job. I have worked with servers for many years. I configure, maintain and develop servers.