DevOps Skills

The broad picture. Skills that address the “from Code to Infrastructure” paradigm: bridging both ends, from code producers to deployment in production. That means shaping the mindset of everyone involved, getting a sense of the whole process, and automating, orchestrating, and monitoring it.

Collaborate with the internal management teams involved in the DevOps process and stay familiar with the objectives, roadmap, blocking issues, and other project areas.
Have the skills to mentor and advise team members on the best ways to deliver code, which tools to use when coding, and how to test the latest features.

The target. Fast provisioning: be able to set up new machines fast. Good monitoring: be able to quickly diagnose failures and trace them down. Quickly roll back to a previous version of a microservice. Rapid app deployment through fully automated pipelines. Create the DevOps mindset / culture.
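The rollback target above can be sketched minimally. This is an illustrative Python sketch, not any specific tool's API: it assumes deployments are recorded per service as a version list, so the previous version is always one step away.

```python
# Minimal sketch of fast rollback, assuming each deployment records the
# released version per service. All names here are illustrative.
class DeployHistory:
    def __init__(self):
        self._history = {}  # service -> list of deployed versions, oldest first

    def deploy(self, service, version):
        """Record a new release of a service."""
        self._history.setdefault(service, []).append(version)
        return version

    def rollback(self, service):
        """Drop the current (failing) release and return the previous one."""
        versions = self._history.get(service, [])
        if len(versions) < 2:
            raise RuntimeError(f"no previous version of {service} to roll back to")
        versions.pop()       # discard the failing release
        return versions[-1]  # previous version becomes current again
```

In a real pipeline the history would live in a registry or deployment database rather than in memory, but the invariant is the same: never deploy without recording what was running before.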

DevOps engineers need to know how to use and understand the roles of the following types of tools:
1. Version control: GitHub, GitLab
2. Continuous Integration servers: code pushed to the repository server triggers builds, tests, and documentation: Jenkins, GitLab CI, Atlassian Bamboo, CircleCI, GitHub Actions
3. Configuration management: Software Configuration Management (SCM) tools automate, monitor, and manage otherwise manual configuration processes, applying system-wide changes across servers, networks, storage, applications, and other managed systems: Puppet, Ansible, Chef
4. Deployment automation: Ansible Tower, Bamboo
5. Containers and container registries: containerd, Docker, Artifactory (artifact and image repository)
6. Infrastructure orchestration: automating the provisioning of the infrastructure services needed to support an app moving into production, in the right order: Terraform, Ansible (also a configuration management tool), Chef, Kubernetes
7. Monitoring and analytics: Prometheus, Datadog, Splunk
8. Testing and cloud quality tools: a test automation platform uses scripts to automate the whole software testing process. Identify the tests that need to be automated; research and analyze the automation tools that meet your needs and budget; based on the requirements, shortlist the two most suitable tools; pilot both and select the better one; discuss the chosen tool with the other stakeholders, explain the choice, and get their approval; then proceed to test automation.
Tools: Kobiton, Eggplant, TestProject, LambdaTest
9. Networking: protocols from layers 4 to 7, nginx, caching, service meshes.
10. Programming skills with Java, Shell, Python, JS, Ruby…
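The CI idea in item 2 can be sketched in a few lines of Python. This is a hedged illustration of the concept, not the behavior of any named CI server; the step names and the push-event shape are made up.

```python
# Sketch of a CI trigger: a push event runs an ordered pipeline
# (build, test, docs) and stops at the first failing step.
def run_pipeline(steps):
    """Run each (name, fn) step in order; report the first failure."""
    for name, step in steps:
        try:
            step()
        except Exception as exc:
            return {"status": "failed", "step": name, "error": str(exc)}
    return {"status": "passed", "steps": [name for name, _ in steps]}


def on_push(event, steps):
    """What a CI server conceptually does when the repository reports a push."""
    print(f"push to {event['branch']} by {event['author']}: running pipeline")
    return run_pipeline(steps)
```

Real CI servers add queuing, isolated workers, caching, and artifact storage on top, but the core loop is exactly this: an event triggers an ordered list of steps that fail fast.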
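The "in the right order" part of item 6 is essentially a dependency ordering problem. The sketch below (resource names invented for illustration) shows the underlying idea as a topological sort: provision a resource only after everything it depends on.

```python
# Sketch of ordered provisioning: deps maps each resource to the
# resources it depends on; the result lists dependencies first.
def provision_order(deps):
    ordered, done = [], set()

    def visit(node, path=()):
        if node in done:
            return
        if node in path:
            raise ValueError(f"dependency cycle at {node}")
        for dep in deps.get(node, []):
            visit(dep, path + (node,))
        done.add(node)
        ordered.append(node)

    for node in deps:
        visit(node)
    return ordered
```

Tools like Terraform build this dependency graph from resource definitions and walk it automatically; the point of the sketch is only that orchestration is ordering, not just provisioning.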

Also:
Monitoring production environments
Performance measurements
Security
Cloud administration
Get proper alerts when something is wrong or unavailable
Help resolve problems either through online support or technical troubleshooting
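The alerting point above ("get proper alerts when something is wrong or unavailable") reduces to a health-check loop. A minimal sketch, assuming each service exposes a probe callable; in practice the probe might be an HTTP GET against a health endpoint, which is an assumption here, not something the source specifies.

```python
# Illustrative health-check pass: probe() returns True when the service
# answers; anything that fails or raises becomes an alert candidate.
def check_services(probes):
    """Return the names of services whose probe failed, for alerting."""
    alerts = []
    for name, probe in probes.items():
        try:
            healthy = probe()
        except Exception:
            healthy = False  # an unreachable service counts as unhealthy
        if not healthy:
            alerts.append(name)
    return alerts
```

Monitoring stacks like Prometheus generalize this pattern with scrape intervals, alert rules, and notification routing.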

Job Breakthroughs

Startup vs. Larger Company:
The advantage of working for a smaller company is that you get to make more of an impact. A larger corporation might offer more benefits or a higher salary, but a startup is where you can really make a difference and see the influence your work has on the business. You are heavily involved in each stage of production, and your opinion is more likely to carry weight than at a larger, more structured operation. Decentralization of big companies could be achieved through tokenization, with shares issued through ICOs.
Jobs in IT:
In artificial intelligence, the Internet of Things, data security, virtual and augmented reality, virtual worlds (and virtual assets), and a bank-less Internet of payment backboned by free nodes. Jobs to watch, or related roles: big data engineer, Software 2.0 engineer (maintaining neural networks that write code), full-stack developer, security engineer, IoT architect, VR/AR engineer, and hybrid engineers with agile mindsets across teams and solid knowledge of technology stacks who, working together, can bind different ends of the domain spectrum (much as DevOps does for the “from Code to Infrastructure” paradigm), as well as runners of the decentralized Internet (sustained by blockchain and similar technologies yet to come, backing the virtual assets of the virtual worlds in the decentralized network).
Thus the skills needed to succeed in the IT jobs of tomorrow revolve around security certifications, programming and application development, proficiency with cloud, decentralized architectures, and mobile technologies, and other specialized skill sets, also giving way to hybrid IT roles that bind the business to IT.
Roles grow vertically based on business domain vs. technology stacks. For example: a Solutions Architect has business domain knowledge but also a technical background; they develop complex technology solutions in a specific business domain. A Software Architect knows the technology stacks more deeply; they design the architecture of the technical implementation. A Technical Lead has deep knowledge of the technology stack, or a part of it; they design using established patterns, coach teams in the adopted technologies, and unblock teams so they can succeed in project delivery.
Data scientists: it is essential for data scientists to work with languages and tools like R, Python, SAS, Hadoop, and Netezza, applying their knowledge of statistics, mathematics (algebra), and (multivariable) matrix calculus, and to know platforms like MapReduce, GridGain, HPCC, Storm, Hive, Pig, and Amazon S3.
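The MapReduce model mentioned above fits in a few lines once the distribution machinery is stripped away. A toy in-memory sketch, not Hadoop's API: map each record to key/value pairs, shuffle by key, then reduce each group.

```python
from collections import defaultdict

# Toy MapReduce: real frameworks distribute the map, shuffle, and reduce
# phases across a cluster; here they run in one process for illustration.
def map_reduce(records, mapper, reducer):
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):   # map phase
            groups[key].append(value)       # shuffle: group values by key
    return {key: reducer(key, values)       # reduce phase
            for key, values in groups.items()}
```

The classic word count is one mapper emitting `(word, 1)` pairs and one reducer summing them, which is the shape most MapReduce jobs share.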
The user as a valuable “in the network” resource, in parallel digital universes (e.g. the Metaverse). Their actions should be monetized and generate income. We already produce valuable data just by navigating Facebook, Google, and other social networks, which the systems themselves use to become better (the long-term plan being to build the future AI systems together). The “Internaut” will be one of the nicest jobs of the future.

IIoT Platforms

GE’s Predix, Siemens’ MindSphere, and the recently announced Honeywell Sentience are likely to be on any short list of industrial cloud platforms. But they aren’t the only ones in this space. Cisco’s Jasper, IBM’s Watson IoT, Meshify, Uptake, and at least 20 others are competing to manage the billions of sensors expected to make up the Industrial Internet of Things (IIoT).

Sample providers: Amazon AWS, AT&T M2X, Bosch IoT, Carriots, Cumulocity, GE Predix, IBM Watson IoT, Google Cloud IoT Core, Intel IoT, Cisco Jasper, Losant IoT, Microsoft Azure, PTC ThingWorx (connected to Windchill/PDMLink), SAP Hana Cloud, Thethings.io, C3IoT, Uptake, Amplia IoT, XMPRO, Meshify, TempoIQ, Bitstew Systems, Siemens MindSphere, AirVantage, Honeywell Sentience, Schneider Electric’s EcoStruxure, Alibaba Cloud (rolling out its big-data service, “MaxCompute”), and Parker Hannifin’s Voice of the Machine IoT platform.

GE Predix is a platform-as-a-service (PaaS) specifically designed for industrial data and analytics. It can capture and analyze the unique volume, velocity, and variety of machine data within a highly secure, industrial-strength cloud environment, and it is designed to handle data types that consumer cloud services are not built for.

Siemens MindSphere is an open platform, based on the SAP HANA (PaaS) cloud, which allows developers to build, extend, and operate cloud-based applications. OEMs and application developers can access the platform via open interfaces and use it for services and analysis such as the online monitoring of globally distributed machine tools, industrial robots, or industrial equipment such as compressors and pumps. MindSphere also allows customers to create digital models of their factories with real data from the production process.

Honeywell Sentience is the cloud infrastructure recently announced by Honeywell Process Solutions. It is a secure, scalable, standards-based “One Honeywell” IoT platform intended to accelerate time-to-market for connected solutions, lower the cost to market, and enable new, innovative SaaS business models. It will have global security standards embedded throughout the solution and make applications plug-and-play and scalable.

C3 IoT is a PaaS that enables organizations to leverage data – telemetry from sensors and devices, data from diverse enterprise information systems, and data from external sources (such as social media, weather, traffic, and commodity prices) – and employ advanced analytics and machine learning at scale, in real time, to capture business insights for improved operations, enhanced customer engagement, and differentiated products and services. C3 IoT is led by Silicon Valley entrepreneur Thomas Siebel. It has closed deals with the U.S. State Department and the French utility ENGIE SA, based on C3 IoT’s focus on machine-generated data.

Uptake is a predictive analytics SaaS platform provider that offers industrial companies the ability to optimize performance, reduce asset failures, and enhance safety. Uptake integrates data science and workflow connectivity to provide high-value solutions using massive data sets. In 2015, it entered into a partnership with heavy construction equipment manufacturer Caterpillar to jointly develop an end-to-end platform for predictive diagnostics in order to help Caterpillar customers monitor and optimize their fleets more effectively.

Meshify is an Industrial IoT platform for tracking, monitoring, and analyzing devices. The Meshify suite of tools provides all the features needed to deploy, monitor, control, and analyze the results of an IoT solution. Despite being a young technology business, it has a growing portfolio of industrially oriented clients, including Henry Pump, Sierra Resources, Stallion Oilfield Services, Gems Sensors & Controls, and MistAway Systems.

http://www.iotcentral.io/