"Apache Hive Interview Question and Answer (100+ FAQ)"
"Apache Hive Interview Questions has a collection of 100+ questions with answers asked in the interview for freshers and experienced (Programming, Scenario-Based, Fundamentals, Performance Tuning based Question and Answer). This course is intended to help Apache Hive Career Aspirants to prepare for the interview.We are planning to add more questions in upcoming versions of this course. The Apache Hive data warehouse software facilitates reading, writing, and managing large datasets residing in distributed storage using SQL. Structure can be projected onto data already in storage. A command line tool and JDBC driver are provided to connect users to Hive.Course Consist of the Interview Question on the following TopicsHive TutorialHive SQL Language Manual: Commands, CLIs, Data Types,DDL (create/drop/alter/truncate/show/describe), Statistics (analyze), Indexes, Archiving,DML (load/insert/update/delete/merge, import/export, explain plan),Queries (select), Operators and UDFs, Locks, AuthorizationFile Formats and Compression: RCFile, Avro, ORC, Parquet; Compression, LZOProcedural Language: Hive HPL/SQLHive Configuration PropertiesHive ClientsHive Client (JDBC, ODBC, Thrift)HiveServer2: Overview, HiveServer2 Client and Beeline, Hive MetricsHive Web InterfaceHive SerDes: Avro SerDe, Parquet SerDe, CSV SerDe, JSON SerDeHive Counters"
Price: 19.99


"Docker Interview Question and Answer (100+ FAQ)"
"Docker Interview Questions has a collection of 100+ questions with answers asked in the interview for freshers and experienced (Programming, Scenario-Based, Fundamentals, Performance Tuning based Question and Answer).This course is intended to help Docker Career Aspirants to prepare for the interview. We are planning to add more questions in upcoming versions of this course.Docker is a set of platform as a service products that use OS-level virtualization to deliver software in packages called containers. Containers are isolated from one another and bundle their own software, libraries and configuration files; they can communicate with each other through well-defined channels.Course Consist of the Interview Question on the following TopicsThe Docker platformDocker EngineDocker architectureThe Docker daemonThe Docker clientDocker registriesDocker objectsImages ContainersNamespacesControl groupsUnion file systemsContainer format"
Price: 19.99


"Apache Kafka Interview Question and Answer(100+ FAQ)"
"Apache Kafka Interview Questions has a collection of 100+ questions with answers asked in the interview for freshers and experienced (Programming, Scenario-Based, Fundamentals, Performance Tuning based Question and Answer).This course is intended to help Apache Kafka Career Aspirants to prepare for the interview. We are planning to add more questions in upcoming versions of this course.Apache Kafka is an open-source stream-processing software platform developed by LinkedIn and donated to the Apache Software Foundation, written in Scala and Java. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.Course Consist of the Interview Question on the following Topics1. Kafka Core2. Kafka APIS3. CONFIGURATION of Kafka4. DESIGN of Kafka5. IMPLEMENTATION of Kafka6. OPERATIONS of Kafka7. SECURITY of Kafka8. KAFKA CONNECT9. KAFKA STREAMS"
Price: 19.99


"Olympic Games Analytics Project in Apache Spark for beginner"
"In this course you will learn to Analyze data (Olympic Game) in Apache Spark using Databricks Notebook (Community edition), 1) Basics flow of data in Apache Spark, loading data, and working with data, this course shows you how Apache Spark is perfect for Big Data Analysis job. 2) Learn basics of Databricks notebook by enrolling into Free Community Edition Server 3) Olympic Games Analytics a real world examples. 4) Graphical  Representation of Data using Databricks notebook.5) Hands-on learning6) Real-time Use Case7) Publish the Project on Web to Impress your recruiter About Databricks: Databricks lets you start writing Spark queries instantly so you can focus on your data problems.Lets discover more about the Olympic Games using Apache SparkData:Data exploration about the recent history of the Olympic GamesWe will explore a dataset on the modern Olympic Games, including all the Games from Athens 1896 to Rio 2016."
Price: 19.99


"Apache Pig Interview Questions and Answers"
"Apache Pig Interview Questions has a collection of 50+ questions with answers asked in the interview for freshers and experienced (Programming, Scenario-Based, Fundamentals, Performance Tuning based Question and Answer).This  course is intended to help Apache Pig Career Aspirants to prepare for the interview. We are planning to add more questions in upcoming versions of this course.Apache Pig is a platform for analyzing large data sets that consists of a high-level language for expressing data analysis programs, coupled with infrastructure for evaluating these programs. The salient property of Pig programs is that their structure is amenable to substantial parallelization, which in turns enables them to handle very large data sets.Course Consist of the Interview Question on the following TopicsPig CorePig Latin Built In FunctionsUser Defined FunctionsControl StructuresShell and Utililty CommandsPerformance and EfficiencyTesting and DiagnosticsVisual EditorsAdministrationIndexMiscellaneous"
Price: 19.99


"Apache Spark Project World Development Indicators Analytics"
"In this Apache Spark course you will learn to Analyze data (World Bank Dataset) in Apache Spark using Databricks Notebook (Community edition), 1) Basics flow of data in Apache Spark, loading data, and working with data, this course shows you how Apache Spark is perfect for Big Data Analysis job. 2) Learn basics of Databricks notebook by enrolling into Free Community Edition Server 3) World Development Indicators Analytics Project a real world examples. 4) Graphical  Representation of Data using Databricks notebook.5) Publish the Project on Web to Impress your recruiter 6) Hands-on learningAbout Databricks: Databricks lets you start writing Spark queries instantly so you can focus on your data problems.Lets discover more about the World Development Indicators Analytics Project using Apache SparkData:The World Development Indicators from the World Bank contain over a thousand annual indicators of economic development from hundreds of countries around the world."
Price: 19.99


"Learn Apache Spark to Generate Weblog Reports for Websites"
"Apache Spark is a flexible and fast framework designed for managing huge volumes of data. The engine supports the use of multiple programming languages, including Python, Scala, Java, and R. Therefore, before starting to learn Apache Spark use, you might want to focus on one of these languages.In this Apache Spark tutorial, we will be focusing on the eCommerce weblog report generation. For companies that are highly dependent on their web presence and popularity, it is crucial to determine the factors that might be related to a successful eCommerce strategy. As a result, some business-owners consider analyzing weblogs. During Apache Spark training, you will be introduced with a variety of reports that you can generate from these weblogs.What is Apache Spark?To learn Apache Spark, you need to be introduced to the basic principles of this engine. First of all, it is a framework for improving speed, simplicity of use, and streaming analytics spread by Apache. Apache Spark is an extremely efficient tool for performing data processing analysis.What are weblogs?A weblog can provide you with insightful information about how your visitors act on your website. By definition, weblog records the actions of users. They might be useful when aiming to determine which parts of your website attract the most attention. Logs can reveal how people found your website (for instance, search engines) and which keywords they used for searches.What will you find in this course?In this course for people that have chosen to learn Apache Spark, we will be focusing on a practical project to improve your skills. There will be some basics of how to use Spark, but you are expected to have a decent understanding of the way it works.For our project, you will have to download several files: they are a must for this Spark tutorial. 
Then, we will start by exploring file-level details and the process of creating a free account in DataBricks.The aim of the project in this course to learn Apache Spark is to review all of the possible reports that you can conduct from the weblogs. We will be retrieving critical information from the log files. For this purpose, we will use the DataBricks Notebook. As a brief reminder: DataBricks allows you to write Spark queries instantly without having to focus on data problems. It is considered as one of the programs to help you manage and organize data.We will learn how to use Spark to generate various types of reports. For instance, a session report provides information about the session activity, referring to the actions that a user with a unique IP performs during a specified period. The number of user sessions determines the amount of traffic that websites receive.This Apache Spark training course will also focus on a pageview report, which determines how many pages were viewed during a specified time. Additionally, you will learn about a new visitor report, indicating the number of new users that have visited the website during a given time.To learn Apache Spark better, you will be introduced with referring domains report, target domains report, top IP addresses report, search query report, and more!In this course, you will learn to create Weblog Report Generation for Ecommerce website log in Apache Spark using Databricks Notebook (Community edition), 1) Basics flow of data in Apache Spark, loading data, and working with data, this course shows you how Apache Spark is perfect for Big Data Reporting Engine. 2) Learn the basics of Databricks notebook by enrolling into Free Community Edition Server 3) Ecommerce Weblog Tracking Report generation Project real-world example. 
4) Graphical  Representation of Data using Databricks notebook.5) Create a Data Pipeline6) Launching Spark Cluster7) Process that data using Apache Spark8) Publish the Project on Web to Impress your recruiter About Databricks: Databricks lets you start writing Spark queries instantly so you can focus on your data problems.Let's discover more about the Ecommerce Weblog Tracking Report generation Project using Apache SparkData:The data is Weblog or Website log of Ecommerce Server (Unreal Data for Training Purpose)"
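To make the session report concrete, here is a minimal plain-Python sketch of the sessionization idea behind it (hypothetical log lines and a 30-minute gap threshold chosen for illustration; the course itself builds these reports in Spark on Databricks):

```python
from datetime import datetime, timedelta

# Hypothetical weblog lines: IP address, request timestamp, path.
LOG_LINES = [
    "10.0.0.1 2023-05-01T10:00:00 /home",
    "10.0.0.1 2023-05-01T10:05:00 /cart",
    "10.0.0.1 2023-05-01T12:00:00 /home",   # >30 min gap -> a new session
    "10.0.0.2 2023-05-01T10:01:00 /home",
]

SESSION_GAP = timedelta(minutes=30)

def count_sessions(lines):
    """Count user sessions per IP: a new session starts after a 30-minute gap."""
    hits = {}
    for line in lines:
        ip, ts, _path = line.split()
        hits.setdefault(ip, []).append(datetime.fromisoformat(ts))
    sessions = {}
    for ip, times in hits.items():
        times.sort()
        n = 1
        for prev, cur in zip(times, times[1:]):
            if cur - prev > SESSION_GAP:
                n += 1
        sessions[ip] = n
    return sessions

print(count_sessions(LOG_LINES))  # {'10.0.0.1': 2, '10.0.0.2': 1}
```

The same grouping-by-IP and gap-splitting logic scales up in Spark, where the log is a distributed dataset instead of an in-memory list.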
Price: 19.99


"Apache Hadoop and Mapreduce Interview Questions and Answers"
"Apache Hadoop and Mapreduce Interview Questions has a collection of 120+ questions with answers asked in the interview for freshers and experienced (Programming, Scenario-Based, Fundamentals, Performance Tuning based Question and Answer).This  course is intended to help Apache Hadoop and Mapreduce Career Aspirants to prepare for the interview. We are planning to add more questions in upcoming versions of this course.The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high-availability, the library itself is designed to detect and handle failures at the application layer, so delivering a highly-available service on top of a cluster of computers, each of which may be prone to failures.Hadoop MapReduce is a software framework for easily writing applications which process vast amounts of data (multi-terabyte data-sets) in-parallel on large clusters (thousands of nodes) of commodity hardware in a reliable, fault-tolerant manner.A MapReduce job usually splits the input data-set into independent chunks which are processed by the map tasks in a completely parallel manner. The framework sorts the outputs of the maps, which are then input to the reduce tasks. Typically both the input and the output of the job are stored in a file-system. The framework takes care of scheduling tasks, monitoring them and re-executes the failed tasks.Typically the compute nodes and the storage nodes are the same, that is, the MapReduce framework and the Hadoop Distributed File System (see HDFS Architecture Guide) are running on the same set of nodes. 
This configuration allows the framework to effectively schedule tasks on the nodes where data is already present, resulting in very high aggregate bandwidth across the cluster.Course Consist of the Interview Question on the following TopicsSingle Node SetupCluster SetupCommands ReferenceFileSystem ShellCompatibility SpecificationInterface ClassificationFileSystem SpecificationCommonCLI Mini ClusterNative LibrariesHDFSArchitectureCommands ReferenceNameNode HA With QJMNameNode HA With NFSFederationViewFsSnapshotsEdits ViewerImage ViewerPermissions and HDFSQuotas and HDFSDisk BalancerUpgrade DomainDataNode AdminRouter FederationProvided StorageMapReduceDistributed Cache DeploySupport for YARN Shared CacheMapReduce REST APIsMR Application MasterMR History ServerYARNArchitectureCommands ReferenceResourceManager RestartResourceManager HANode LabelsNode AttributesWeb Application ProxyTimeline ServerTimeline Service V.2Writing YARN ApplicationsYARN Application SecurityNodeManagerUsing CGroupsYARN FederationShared CacheYARN UI2YARN REST APIsIntroductionResource ManagerNode ManagerTimeline ServerTimeline Service V.2YARN ServiceYarn Service APIHadoop StreamingHadoop ArchivesHadoop Archive LogsDistCpHadoop BenchmarkingReferenceChangelog and Release NotesConfigurationcore-default.xmlhdfs-default.xmlhdfs-rbf-default.xmlmapred-default.xmlyarn-default.xmlDeprecated Properties"
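The map, sort/shuffle, and reduce flow described above can be illustrated with a toy word count in plain Python (a conceptual sketch only, not Hadoop's actual API):

```python
from itertools import groupby
from operator import itemgetter

# Toy input split into independent "chunks", as a MapReduce job splits its data-set.
chunks = [["big data big"], ["data hadoop data"]]

def map_phase(chunk):
    """Map task: emit (key, 1) pairs for each word in a chunk."""
    for line in chunk:
        for word in line.split():
            yield (word, 1)

# Run all map tasks (in Hadoop these run in parallel across nodes).
pairs = [kv for chunk in chunks for kv in map_phase(chunk)]

# The framework sorts map output by key before handing it to reducers.
pairs.sort(key=itemgetter(0))

# Reduce task: sum the values for each key.
counts = {key: sum(v for _, v in group)
          for key, group in groupby(pairs, key=itemgetter(0))}

print(counts)  # {'big': 2, 'data': 3, 'hadoop': 1}
```

In real Hadoop the chunks live in HDFS blocks, the map and reduce tasks run on separate nodes, and the framework handles the sort/shuffle between them.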
Price: 19.99


"Predictive Analytics with Apache Spark including Project"
"This course attempts to explain the basic concepts of the exponential family of predict modeling. You shall learn about the different components of this family and their relationship. Also, learn about how to identify the right model fitment to the given data series.One of the most common mistakes an analyst could make during a predictive modeling project is ignorance of sample bias in their data. The cost of making such a mistake can be quite substantial to an organization's business outcome.In this video course, we will share with you some secrets on how to avoid this mistake. You will learn the following topics:1) Classification Model2) Regression ModelLearn the fundamentals of predictive analysis through an easy to understand the conceptual course.At the end of the class, you should have gained sufficient knowledge to help you detect and reduce sample bias in future predictive modeling or advanced analytics projects.Predictive Analytics with Apache Spark using Databricks (Unofficial)  Notebook (Community edition)  including ProjectExplore Apache Spark and Machine Learning on the Databricks platform.Launching Spark ClusterCreate a Data PipelineProcess that data using a Machine Learning model (Spark ML Library)Hands-on learning using the example (Classification and Regression)Real-time Use CasePublish the Project on Web to Impress your recruiter Graphical Representation of Data using Databricks notebook.Transform structured data using SparkSQL and DataFramesAbout Databricks: Databricks lets you start writing Spark ML code instantly so you can focus on your data problems."
Price: 19.99


"Employee Attrition Prediction in Apache Spark (ML) Project"
"Spark Machine Learning Project (Employee Attrition Prediction) for beginners using Databricks Notebook (Unofficial) (Community edition Server) In this Data science Machine Learning project, we will create Employee Attrition Prediction Project using Decision Tree Classification algorithm one of the predictive models.Explore Apache Spark and Machine Learning on the Databricks platform.Launching Spark ClusterCreate a Data PipelineProcess that data using a Machine Learning model (Spark ML Library)Hands-on learningReal time Use Case Publish the Project on Web to Impress your recruiter Graphical Representation of Data using Databricks notebook.Transform structured data using SparkSQL and DataFramesEmployee Attrition Prediction a Real time Use Case on Apache SparkAbout Databricks: Databricks lets you start writing Spark ML code instantly so you can focus on your data problems."
Price: 19.99


"Spark Machine Learning Project (House Sale Price Prediction)"
"Spark Machine Learning Project (House Sale Price Prediction) for beginners using Databricks Notebook (Unofficial) (Community edition Server) In this Data science Machine Learning project, we will predict the sales prices in the Housing data set using LinearRegression one of the predictive models.Explore Apache Spark and Machine Learning on the Databricks platform.Launching Spark ClusterCreate a Data PipelineProcess that data using a Machine Learning model (Spark ML Library)Hands-on learningReal time Use Case Publish the Project on Web to Impress your recruiter Graphical Representation of Data using Databricks notebook.Transform structured data using SparkSQL and DataFramesPredict sales prices a Real time Use Case on Apache SparkAbout Databricks: Databricks lets you start writing Spark ML code instantly so you can focus on your data problems."
Price: 19.99


"Telecom Customer Churn Prediction in Apache Spark (ML)"
"Apache Spark Started as a research project at the University of California in 2009, Apache Spark is currently one of the most widely used analytics engines. No wonder: it can process data on an enormous scale, supports multiple coding languages (you can use Java, Scala, Python, R, and SQL) and runs on its own or in the cloud, as well as on other systems (e.g., Hadoop or Kubernetes).In this Apache Spark tutorial, I will introduce you to one of the most notable use cases of Apache Spark: machine learning. In less than two hours, we will go through every step of a machine learning project that will provide us with an accurate telecom customer churn prediction in the end. This is going to be a fully hands-on experience, so roll up your sleeves and prepare to give it your best!First and foremost, how does Apache Spark machine learning work?Before you learn Apache Spark, you need to know it comes with a few inbuilt libraries. One of them is called MLlib. To put it simply, it allows the Spark Core to perform machine learning tasks and (as you will see in this Apache Spark tutorial) does it in breathtaking speed. Due to its ability to handle significant amounts of data, Apache Spark is perfect for tasks related to machine learning, as it can ensure more accurate results when training algorithms.Mastering Apache Spark machine learning can also be a skill highly sought after by employers and headhunters: more and more companies get interested in applying machine learning solutions for business analytics, security, or customer service. Hence, this practical Apache Spark tutorial can become your first step towards a lucrative career!Learn Apache Spark by creating a project from A to Z yourself!I am a firm believer that the best way to learn is by doing. Thats why I havent included any purely theoretical lectures in this Apache Spark tutorial: you will learn everything on the way and be able to put it into practice straight away. 
Seeing the way each feature works will help you learn Apache Spark machine learning thoroughly by heart.I will also be providing some materials in ZIP archives. Make sure to download them at the beginning of the course, as you will not be able to continue with the project without it.And thats not all youre getting from this course can you believe it?Apart from Spark itself, I will also introduce you to Databricks a platform that simplifies handling and organizing data for Spark. Its been founded by the same team that initially started Spark, too. In this course, I will explain how to create an account on Databricks and use its Notebook feature for writing and organizing your code.After you finish my Apache Spark tutorial, you will have a fully functioning telecom customer churn prediction project. Take the course now, and have a much stronger grasp of machine learning and data analytics in just a few hours!Spark Machine Learning Project (Telecom Customer Churn Prediction) for beginners using Databricks Notebook (Unofficial) (Community edition Server) In this Data Science Machine Learning project, we will create Telecom Customer Churn Prediction Project using Classification Model Logistic Regression, Naive Bayes and One-vs-Rest classifier few of the predictive models.Explore Apache Spark and Machine Learning on the Databricks platform.Launching Spark ClusterCreate a Data PipelineProcess that data using a Machine Learning model (Spark ML Library)Hands-on learningReal time Use Case Publish the Project on Web to Impress your recruiter Graphical  Representation of Data using Databricks notebook.Transform structured data using SparkSQL and DataFramesTelecom Customer Churn Prediction a Real time Use Case on Apache SparkAbout Databricks: Databricks lets you start writing Spark ML code instantly so you can focus on your data problems."
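As a taste of what a churn classifier does, here is a minimal logistic-regression sketch in plain stdlib Python (the features, numbers, and learning rate are made up for illustration; the course itself trains its models with Spark MLlib on Databricks):

```python
import math

# Toy churn data: (monthly_charge, tenure_months) -> churned (1) or stayed (0).
# Hypothetical numbers purely for illustration.
DATA = [
    ((90.0, 2), 1), ((85.0, 4), 1), ((95.0, 1), 1),
    ((30.0, 40), 0), ((45.0, 36), 0), ((35.0, 50), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.01, epochs=2000):
    """Fit logistic-regression weights with plain stochastic gradient descent."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y          # gradient of the logistic loss w.r.t. the logit
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

w, b = train(DATA)
predict = lambda x1, x2: sigmoid(w[0] * x1 + w[1] * x2 + b)
print(predict(92.0, 3))   # high churn probability (near 1)
print(predict(32.0, 45))  # low churn probability (near 0)
```

Spark's LogisticRegression estimator fits the same kind of model, but distributes the optimization over a cluster and plugs into ML pipelines.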
Price: 19.99


"Heart Attack and Diabetes Prediction Project in Apache Spark"
"Apache Spark Project - Heart Attack and Diabetes Prediction Project in Apache Spark Machine Learning Project (2 mini-projects) for beginners using Databricks Notebook (Unofficial) (Community edition Server) In this Data science Machine Learning project, we will create 1) Heart Disease Prediction 2) Diabetes Predictionusing a few algorithms of the predictive models.Explore Apache Spark and Machine Learning on the Databricks platform.Launching Spark ClusterProcess that data using a Machine Learning model (Spark ML Library)Hands-on learningReal time Use Case Create a Data PipelinePublish the Project on Web to Impress your recruiter Graphical  Representation of Data using Databricks notebook.Transform structured data using SparkSQL and DataFramesData exploration using Apache Spark1) Heart Disease Prediction using Decision Tree Classification Model2) Diabetes Prediction using Logistic Regression Model and One-vs-Rest classifier (a.k.a. One-vs-All) Model A Real time Use Case on Apache SparkAbout Databricks: Databricks lets you start writing Spark ML code instantly so you can focus on your data problems."
Price: 19.99


"Build Apache Spark Analytics Project using Web Server Log"
"Welcome to our course. Looking to learn Apache Spark Analytics end-to-end projects and take it to a job interview? You have come to the RIGHT course!This course teaches you Apache Spark 3.0 with Scala, building Spark Analytics programs, and helps you practice hands-on with an end-to-end real-life application project. Our goal is to help you and everyone learn, so we keep our prices low and affordable.Apache Spark is the hottest Big Data skill today. More and more organizations are adapting Apache Spark for building their big data processing and analytics applications and the demand for Apache Spark professionals is sky rocketing. Learning Apache Spark is a great vehicle to good jobs, better quality of work and the best remuneration packages.The goal of this project is provide hands-on training that applies directly to real-world Big Data projects. It uses the learn-train-practice-apply methodology where youLearn solid fundamentals of the domainPractice hands-on and validate it with solutions providedApply the knowledge you acquired in an end-to-end real-life projectTaught by an expert in the field, you will also get a prompt response to your queries and excellent support from Udemy.In this course, you will learn to Analyze data (Apache Web Server log) in Apache Spark using Databricks Notebook (Community edition), 1) Basics flow of data in Apache Spark, loading data, and working with data, this course shows you how Apache Spark is perfect for Big Data Analysis job. 2) Data exploration about Apache Web Server Log using Apache Spark3) Learn the basics of Databricks notebook by enrolling into Free Community Edition Server 4) Apache Web Server logs Analytics a real-world example. 
5) Graphical  Representation of Data using Databricks notebook.6) Transform structured data using SparkSQL and DataFrames7) Launching Spark Cluster8) Hands-on learning9) Real-time Use Case10) Publish the Project on Web to Impress your recruiter About Databricks: Databricks lets you start writing Spark queries instantly so you can focus on your data problems.Let's discover more about the Apache Web Server log Report generation Project for beginners using Apache SparkData:Data exploration about the recent history of the Apache Web Server log."
Price: 19.99


"Apache Spark MCQ Practice Test useful for Certification"
"Apache Spark Multiple Choice Question Practice Test for Certification (Unofficial)  Course is designed for Apache Spark Certification Enthusiast"" This is an Unofficial course and this course is not affiliated, licensed or trademarked with Any Spark Certification in any way.""Learn the latest Big Data Technology - Spark! And learn to use it with one of the most popular programming languages, Scala!One of the most valuable technology skills is the ability to analyze huge data sets, and this course is specifically designed to bring you up to speed on one of the best technologies for this task, Apache Spark! The top technology companies like Google, Facebook, Netflix, Airbnb, Amazon, NASA, and more are all using Spark to solve their big data problems!Spark can perform up to 100x faster than Hadoop MapReduce, which has caused an explosion in demand for this skill! Because the Spark 3.0 DataFrame framework is so new, you now have the ability to quickly become one of the most knowledgeable people in the job market!Useful for CRT020: Databricks Certified Associate Developer for Apache Spark 2.4 with Scala 2.11 Assessment This course offers you practice tests comprising of Most Expected Questions for Exam practice, that mimics the actual certification exam, which will help you get prepared for the main exam environment.It will help you prepare for certification by providing sample questions.This will boost your confidence to appear for certification and also provides you with sample scenarios so that you are well equipped before appearing for the exam.Please Note: These questions are only for practice and understanding level of knowledge only. It is not necessary that these questions may or may not appear for examinations and / or interview questions."
Price: 19.99


"Build Apache Spark Machine Learning Project (Banking Domain)"
"Predicting Customer Response to Bank Direct Telemarketing Campaign Project in Apache Spark Project (Machine Learning) for a beginner using Databricks Notebook (Unofficial)Why should you learn Apache Spark Machine Learning Project?Apache Spark Machine Learning is becoming incredibly popular, and with good reason. According to IBM, Ninety percent of the data in the world today has been created in the last two years alone. Our current output of data is roughly 2.5 quintillion bytes per day. The world is being immersed in data, more so each and every day. Machine learning is about learning from existing data to make predictions. Its based on creating models from input datasets for data-driven decision making.Data science is the discipline of extracting the knowledge from large datasets (structured or unstructured) to provide insights to business teams and influence business strategies and roadmaps. The role of a data scientist is more critical than ever in solving problems that are not easy to solve using traditional numerical methods.Project Details:Telemarketing advertising campaigns are a billion-dollar effort and one of the central uses of the machine learning model. However, its data and methods are usually kept under lock and key. The Project is related to the direct marketing campaigns of a banking institution. The marketing campaigns were based on phone calls. 
Often, more than one contact to the same client was required, in order to access if the product (bank term deposit) would be ('yes') or not ('no') subscribed.In this Data Science Machine Learning project, we will create Predicting Customer Response to Bank Direct Telemarketing Campaign Project in Apache Spark Project (Machine Learning) using Classification Model, Logistic Regression, few of the predictive models.Explore Apache Spark and Machine Learning on the Databricks platform.Launching Spark ClusterCreate a Data PipelineA process that data using a Machine Learning model (Spark ML Library)Hands-on learningReal-time Use Case Publish the Project on Web to Impress your recruiter Predicting Customer Response to Bank Direct Telemarketing Campaign Project a Real-time Use Case on Apache SparkAbout Databricks: Databricks lets you start writing Spark ML code instantly so you can focus on your data problems."
Price: 19.99


"Spark Project (Prediction Online Shopper Purchase Intention)"
"Real-time Prediction of online shoppers purchasing intention Project using Apache Spark Machine Learning Models a Data Pipeline Creation.What is this course about? This course covers all the fundamentals about Apache Spark Machine Learning Project with Scala and teaches you everything you need to know about developing Spark Machine Learning applications using Scala, the Machine Learning Library API for Spark. At the end of this course, you will gain in-depth knowledge about Spark Machine Learning and general big data manipulation skills to help your company to adapt Spark Machine Learning for building Machine Learning Model processing pipelines and data analytics applications. This course will be absolutely critical to anyone trying to make it in data science today. Project Details:Once a user logs into an online shopping website, knowing whether the person will make a purchase or not holds a massive economical value. A lot of current research is focused on real-time revenue predictors for these shopping websites. In this article, we will start building a revenue predictor for one such website. In this Data Science Machine Learning project, we will create a Real-time prediction of online shoppers purchasing intention Project using Apache Spark Machine Learning Models using Logistic Regression, one of the predictive models a data pipeline projectImplementing Apache Spark and Machine Learning on the Databricks platform. Creating a Spark ClusterMake a Data Pipeline A cycle that information utilizing a Machine Learning model (Spark ML Library)Hands-on learning Ongoing Use Case Distribute the Project on Web to Impress Employer.Prediction of Online Shoppers Purchasing Intention Project a Real-time Use Case on Apache SparkAbout Databricks: Databricks lets you start writing Spark ML code instantly so you can focus on your data problems."
Price: 19.99


"Apache Zeppelin - Big Data Visualization Tool"
"Apache Zeppelin - Big Data Visualization Tool for Big data Engineers An Open Source Tool (Free Source for Data Visualization)Learn the latest Big Data Technology - Apache Zeppelin! And learn to use it with one of the most popular programming Data Visualization Tool!One of the most valuable technology skills is the ability to analyze huge data sets, and this course is specifically designed to bring you up to speed on one of the best technologies for this task, Apache Zeppelin! The top technology companies like Google, Facebook, Netflix, Airbnb, Amazon, NASA, and more are all using Apache Zeppelin to solve their big data problems!Master Bigdata Visualization with Apache Zeppelin.Various types of Interpreters to integrate with various big data ecosystemApache Zeppelin provides a web-based notebook along with 20 plus Interpreters to interact with and facilitates collaboration from a WebUI. Zeppelin supports Data Ingestion, Data Discovery, Data Analysis, and Data Visualization.Using an integration of Interpreters is very simple and seamless. Resultant data can be exported or stored in various sources or can be explored with various visualization and can be analyzed with pivot graph like the setupThis course introduces every aspect of visualization, from story to numbers, to architecture, to code. Tell your story with charts on the web. Visualization always reflects the reality of the data.We will Learn: Data Ingestion in Zeppelin environmentConfiguring InterpreterHow to Use Zeppelin to Process Data in Spark Scala, Python, SQL and MySQLData DiscoveryData Analytics in ZeppelinData VisualizationPivot ChartDynamic FormsVarious types of Interpreters to integrate with a various big data ecosystemVisualization of results from big data"
Price: 19.99


"Build Apache Spark Machine Learning Project for eCommerce"
"Build Apache Spark Machine Learning Project for an eCommerce Company that sells clothes online (Revenue Prediction Model)We will learn the most important aspect of Spark Machine learning (Spark MLlib):Apache Spark fundamentals and implementing spark machine learningImporting and Working with DatasetsProcess data using a Machine Learning model using spark MLlibBuild and train a Linear regression modelTest and analyze the modelLearn the latest Big Data Technology - Spark! And learn to use it with one of the most popular programming languages, Scala!One of the most valuable technology skills is the ability to analyze huge data sets, and this course is specifically designed to bring you up to speed on one of the best technologies for this task, Apache Spark! The top technology companies like Google, Facebook, Netflix, Airbnb, Amazon, NASA, and more are all using Spark to solve their big data problems!Spark can perform up to 100x faster than Hadoop MapReduce, which has caused an explosion in demand for this skill! Because the Spark 3.0 DataFrame framework is so new, you now have the ability to quickly become one of the most knowledgeable people in the job market!Project Details: This project is about an eCommerce company that sells clothes online. This project is about customers who buy clothes online. The store offers in-store style and clothing advice sessions. 
Customers come into the store, have sessions/meetings with a personal stylist, then they can go home and order either on a mobile app or website for the clothes they want.We need to predict the future spending of Customer(ie Revenue for Company ) so business strategies can be made to convert ""Customer"" to ""Loyalty Customer"" In this Data Science Machine Learning project, we will create an eCommerce Customer Revenue Prediction Project using Apache Spark Machine Learning Models using Linear Regression, one of the predictive models.Explore Apache Spark and Machine Learning on the Databricks platform.Launching Spark ClusterCreate a Data PipelineA process that data using a Machine Learning model (Spark ML Library)Hands-on learningReal-time Use CasePublish the Project on Web to Impress your recruiter eCommerce Customer Revenue Prediction Project a Real-time Use Case on Apache SparkAbout Databricks: Databricks lets you start writing Spark ML code instantly so you can focus on your data problems."
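For a sense of what the Linear Regression step computes, here is a minimal pure-Python sketch of ordinary least squares on one feature (the course itself trains the model with Spark MLlib; the session/spend numbers below are invented for illustration):

```python
# Hypothetical data: minutes of in-store styling session vs. yearly spend ($).
# Constructed to lie exactly on spend = 10 * minutes + 50.
sessions = [10.0, 20.0, 30.0, 40.0, 50.0]
spend = [150.0, 250.0, 350.0, 450.0, 550.0]

def fit_simple_linear(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = fit_simple_linear(sessions, spend)
# On this toy data the fit recovers slope = 10.0 and intercept = 50.0
```

Spark's `LinearRegression` generalizes this to many features at once and distributes the computation, but the objective it minimizes is the same squared error.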
Price: 19.99


"Jupyter Notebook - Big Data Visualization Tool"
"Jupyter Notebook - Big Data Visualization Tool for Big data Engineers and Data Scientist for an Open Source Tool (Free Source for Data Visualization)Learn the latest Big Data Technology  Tool- Jupyter Notebook! And learn to use it with one of the most popular programming languages, Scala, Python, Julia, R, Ruby, and many more!One of the most valuable technology skills is the ability to analyze huge data sets, and this course is specifically designed to bring you up to speed on one of the best technologies for this task, Jupyter Notebook! The top technology companies like Google, Facebook, Netflix, Airbnb, Amazon, NASA, and more are all using Spark to solve their big data problems!Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more.JupyterLab is a web-based interactive development environment for Jupyter notebooks, code, and data. JupyterLab is flexible: configure and arrange the user interface to support a wide range of workflows in data science, scientific computing, and machine learning. JupyterLab is extensible and modular: write plugins that add new components and integrate with existing ones.Master Bigdata Visualization with Jupyter Notebook.Jupyter Notebook provides a web-based notebook you can use support following languages:PythonJulia R Ruby Haskell Scala node.js Go We will Learn: Data Ingestion in Jupyter environmentHow to Use Jupyter to process Data in Python, Scala, Julia, R and SwiftData DiscoveryData Analytics in JupyterData VisualizationJupyterLabJupyter NotebookHow to use JupyterLabHow to use Jupyter Notebook"
Price: 19.99


"Delta Lake with Apache Spark using Scala"
"You will Learn Delta Lake with Apache Spark using Scala on DataBricks PlatformLearn the latest Big Data Technology - Spark! And learn to use it with one of the most popular programming languages, Scala!One of the most valuable technology skills is the ability to analyze huge data sets, and this course is specifically designed to bring you up to speed on one of the best technologies for this task, Apache Spark! The top technology companies like Google, Facebook, Netflix, Airbnb, Amazon, NASA, and more are all using Spark to solve their big data problems!Spark can perform up to 100x faster than Hadoop MapReduce, which has caused an explosion in demand for this skill! Because the Spark 3.0 DataFrame framework is so new, you now have the ability to quickly become one of the most knowledgeable people in the job market!Delta Lake is an open-source storage layer that brings reliability to data lakes. Delta Lake provides ACID transactions, scalable metadata handling, and unifies streaming and batch data processing. Delta Lake runs on top of your existing data lake and is fully compatible with Apache Spark APIs.Apache Spark is a fast and general-purpose cluster computing system. It provides high-level APIs in Java, Scala, Python and R, and an optimized engine that supports general execution graphs. 
It also supports a rich set of higher-level tools including Spark SQL for SQL and structured data processing, MLlib for machine learning, GraphX for graph processing, and Spark Streaming.Topics Included in the CoursesIntroduction to Delta LakeIntroduction to Data LakeKey Features of Delta LakeIntroduction to SparkFree Account creation in DatabricksProvisioning a Spark ClusterBasics about notebooksDataframesCreate a tableWrite a tableRead a tableSchema validationUpdate table schemaTable MetadataDelete from a tableUpdate a TableVacuumHistoryConcurrency ControlOptimistic concurrency controlMigrate Workloads to Delta LakeOptimize Performance with File ManagementAuto OptimizeOptimize Performance with CachingDelta and Apache Spark cachingCache a subset of the dataIsolation LevelsBest PracticesFrequently Asked Question in Interview About Databricks: Databricks lets you start writing Spark code instantly so you can focus on your data problems."
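One of the topics listed above, optimistic concurrency control, is easy to illustrate outside of Delta Lake itself. The toy class below is a hypothetical pure-Python sketch of the idea (not Delta Lake's actual implementation): writers never lock the table, but a commit only succeeds if no other commit has advanced the table version since the writer started.

```python
class OptimisticTable:
    """Toy sketch of optimistic concurrency control: readers never
    block, and a writer commits only if the table version is unchanged
    since the writer's snapshot; otherwise the writer must retry."""

    def __init__(self):
        self.version = 0
        self.rows = []

    def begin(self):
        # Record the version this transaction read its snapshot at.
        return self.version

    def commit(self, read_version, new_rows):
        if read_version != self.version:
            return False  # a conflicting commit landed first; retry
        self.rows = new_rows
        self.version += 1
        return True

table = OptimisticTable()
v_a = table.begin()                # writer A takes a snapshot
v_b = table.begin()                # writer B starts concurrently
ok_a = table.commit(v_a, ["a"])    # A commits first and succeeds
ok_b = table.commit(v_b, ["b"])    # B's snapshot is now stale: rejected
```

Delta Lake applies the same principle at the level of its transaction log, with conflict detection that can often reconcile non-overlapping changes instead of simply rejecting them.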
Price: 19.99


"Amazon Seller Central - Selling On Amazon"
"In this Master Class course you will learn what all is needed to set up your own Amazon Seller Central account. We will be going over some account overview  where you will be able to determine what are good product listings as well as what are bad product listings, inventory basics, what is the Amazon Flywheel and Amazon's 5 Pillars for success as well as CRO basics with Brice McBeth."
Price: 19.99


"Sports injury rehabilitation"
"This course shows the process the body goes through following an injury and the process of rehabilitation.  It begins with immediate first aid and follows the general process of rehab.  This includes range of motion, strengthening, including sport specific, proprioception and whole body training, including cardio training, flexibility and mental training.  The course is general in the process but does give injury specific examples of types of exercises in each area."
Price: 24.99


"The basics of concussions"
"This course will address what concussions are, how to identify them with both symptoms and the SCAT.  Students will learn what to do once you have a concussion and the recovery process with both return to learn and play.  Post concussive concerns will also be addressed and what steps you can take to help. "
Price: 24.99


"Python & Whatsapp Hacking & Batan sona python uygulamalari!"
"-Bu kurs gerek python dilinin temelini gerekse de python n ileri seviyede kullanlacak gerekli modullerini  ileri seviyede ve  gerek hayatta nelerin yapldn rahat ve elenceli bir ekilde akla kavusturulmas. -WHATSAPP BOT-Genel webdriver yaps-Kullanc arayzl uygulama gelitirme-Bu kursta python ile ilgili ilk bata temel ve kendinizi gelitirebileceiniz dzeyde giri dersi verdikten sonra orta ve ileri dzeyde ki konulara arlk verdim . Whatsapp bot yapma (bu ayn zamanda btn bot uygulamalarnn temeli olarakta saylabilir) bunun yansra gui yani kullanc arayz oluturmay ve  daha ayrntl ksmlara kendi arzunuzun istei lde aratrma ekillerini sizlerle paylatm.-Ayn zaman da python ile c++ , c# , java , swift gibi dillere hzlca adaptasyon.Python bilgisiyle hacking toollar yazabilme becerisi ve anlayabilme becerisi."
Price: 199.99


"Learn Turkish Online"
"During this course you will learn Turkish grammar in the easiest way. All the topics discussed here are enriched with examples and all subjects may be seen both in English and Turkish languages. Bu kurs boyunca Trke dil bilgisini en kolay biimde reneceksiniz. Ele alnan tm konular rnekler ile zenginletirilmi olup, tm konular hem ngilizce hem de Trke grlebilir."
Price: 59.99


"Instalador de cmeras de segurana (CFTV)"
"Esse curso ideal para profissionais que desejam obter uma renda extra instalando cmeras de segurana.As aulas so fceis de entender e permitem um aprendizado rpido mesmo para quem no tem nenhuma experincia na rea.O aluno ir aprender os fundamentos de cmeras de CFTV, processo de instalao, configurao e projetos pequenos negcios.O autor trabalha mais de 16 anos na rea e j ministrou cursos no Brasil e outros 17 pases."
Price: 139.99


"NETWORK ANALYSIS"
"basics of electrical networks determination of energy stored in circuits deals with Kirchhoff laws (KCL, KVL)source transformationcircuit reduction methods such as mesh and nodal analysis SPECIAL CASES: super mesh, super nodalwye-delta transformationnetwork theorems (thevenin's, Norton's, superposition, maximum power transfer, millman's, reciprocity)two port network parameters (Z, Y, H, T, inverse T, inverse H) and their properties the discussion"
Price: 19.99


"Google Hacks For Businesses: Introduction to Google Tools"
"If I could show you a way to make more income and increase your productivity using Google's tools and software, would you be interested?When you think of Google, you think of a search engine - and maybe an email provider. But the truth is, Google is a powerful tool. With hidden functions and the ever-so-popular G-Suite, Google gives business owners & entrepreneurs the opportunity to uncover keywords and niches, advertise their business, communicate and collaborate directly with clients and keep their information secure. Google Hacks For Entrepreneurs: Master The Search Engine is a step-by-step, over-the-shoulder walkthrough, designed to give you the tools to kickstart your business using Google. The first section is all about research. Even Google Search is more than a simple search engine. You will learn a little-known way to find low competition keywords that you can target, find which keywords your competitions are ranking for and how to prevent Google from indexing webpages you don't want index. The second section is all about Google Alerts & Trends. You will learn how the experts position themselves on social media with the latest breaking news for your niche and how to use Google Trends for your advantage. The third section is all about productivity. No business becomes sustainable without consistency and productivity! Turn your browser into a timer, manage your calendar correctly & use this secret tool to help you stay on top of your game.The fourth section is all about client communication. Google offers two extraordinary tools for client communication: Google Hangouts (perfect for high-ticket webinars!) and Google Groups. If you're teaching online, working with clients or using webinars to convert prospective customers, this is for you. The fifth section is all about G-Suite. 
Google allows you to keep your information safe, collaborate with other creators and even create promotional videos - all from the safety and comfort of your drive!The sixth section is all about Google's 'Under-The-Radar' business tools. Learn all about listing your business on Google (for free!), how to publish your blog & newsletters for free (!) and how to find global niches you can take advantage of. Clients from all over the world are at the tip of your fingers!The final section is all about advertising. Targeted traffic has never been easier! Whether you want to run your own ads using AdWords or profit off other people's ads using AdSense, this section will teach you how. Add an extra revenue stream or acquire new clients - the choice is yours! Please be aware this is a beginner's course. AdSense and AdWords are platforms that have in-depth, Google-certified courses dedicated to them. This is meant to help you get your ads and revenue up and running."
Price: 149.99


"Google Analytics Mastery of Custom Dashboards"
"Boost Revenue, Traffic and own your metrics. Understand more with the key performance indicators. Opt-in NowGoogle Analytics gives you the tools you need to better understand your customers. You can then use those business insights to take action.Get to know your customers.Get a deeper understanding of your customers. Google Analytics gives you the free tools you need to analyze data for your business in one place.See whats in it for you.Build a complete picture.Understand your site and app users to better evaluate the performance of your marketing, content, products, and more.Get insights only Google can give.Access Googles unique insights and machine learning capabilities to help get the most out of your data.Connect your insights to results.Analytics is built to work with Googles advertising and publisher products so you can use your analytics insights to reach the right customers.Make your data work for you.Process and share your data quickly with an easy-to-use interface and shareable reports.Designed to work together.Easily access data from other solutions while working in Analytics, for a seamless workflow that saves you time and increases efficiency.Google AdsGain deeper insights into how users from your Google Ads campaigns engage with your site.Data StudioConnect Analytics with Data Studio to easily build performance dashboards and create customized reports."
Price: 199.99


"Docker Essentials for Python Developers"
"Docker & Containers are Foundations of modern DevOps practices. These are must-have skills of Full Stack Developers.Containers have become a standard in Production-grade Deep Learning applications.Every Python Developer must be fluent and comfortable in using Containers at every step of Application lifecycle.You learn Docker and Containers quickly in this Course.It is designed to be very practical and give you Quick Win without spending too much time. I use Minimal Manual teaching approach: each idea, concept or skill has a dedicated Lecture. This way you are going to learn much faster.Here you learn everything important to prove you know Containers:How to build and run Containers with Python AppsContainerize Flask-based Microservices and Django Web AppsUse Docker in Data Science and Machine Learning EnvironmentsCreate complex Development & Test Environments with Docker ComposeYou are going to get practical results in first hour of using this Course!Don't wait any longer, start using Containers now!Course Introduction section is free to watch, as well as first Lectures of each Section. Check them out!"
Price: 99.99


 