The Industrial Internet of Things (IIoT) refers to interconnected instruments, sensors and other devices networked together in an industrial setting. This connectivity enables remote access, efficient monitoring, data acquisition, and the analysis and exchange of data from different sources, among much more. IIoT solutions have enormous potential for increasing productivity, and are also known for their low cost and quick implementation.
On the threshold of the fourth industrial revolution, industrial organizations are investing more in IIoT to improve operational performance, visibility and insights, which helps streamline processes. Taking the complexity out of deploying, connecting and managing devices in industrial settings is key to IIoT success.
Here are some of the key benefits that you can expect from IIoT solutions:
Increase Efficiency
The top benefit of IIoT is the ability to automate, remotely monitor operations and make data-driven decisions, thus enhancing operational efficiency.
Reduce Errors
Industrial IoT digitizes nearly all processes. By reducing manual procedures and entries, the risk associated with human errors is largely reduced.
Provide Predictive Maintenance
Machine and asset downtime can adversely impact industrial operations. Industrial IoT solutions can continuously monitor the performance and functions of various industrial assets and help establish a baseline. This baseline, along with the corresponding data, gives industries the information they need to resolve issues pre-emptively.
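To make the idea of a performance baseline concrete, here is a minimal Python sketch (with made-up vibration readings) of how a baseline and deviation threshold might be derived from monitored data; the values, window size and 3-sigma threshold are illustrative assumptions, not part of any particular IIoT product.

```python
from statistics import mean, stdev

# Illustrative vibration readings (mm/s) collected during normal operation.
baseline_window = [2.1, 2.3, 2.2, 2.4, 2.2, 2.3, 2.1, 2.5, 2.2, 2.3]

baseline = mean(baseline_window)
spread = stdev(baseline_window)
threshold = baseline + 3 * spread  # flag readings 3 standard deviations above normal

def check_reading(value_mm_s: float) -> bool:
    """Return True if the new reading deviates from the learned baseline."""
    return value_mm_s > threshold

# New readings streaming in from the asset.
for reading in [2.2, 2.4, 3.9]:
    if check_reading(reading):
        print(f"ALERT: vibration {reading} mm/s exceeds baseline threshold {threshold:.2f}")
    else:
        print(f"OK: vibration {reading} mm/s within normal range")
```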
Improve Safety
Fully functioning IIoT solutions have integrated safety systems that use data from monitoring and control devices to help improve workplace safety. If an incident occurs, valuable data is obtained from these systems, which can help prevent a repeat occurrence in the future. Wearables are also used in industrial IoT operations to keep tabs on things such as surrounding noise levels and employee posture, and to instantly alert employees when they do not follow proper safety procedures.
Reduce Cost
The knowledge gained through IIoT solutions provides important data-driven insights that help improve processes across design, operation, manufacturing, marketing, sales and more, thus steering the business in a profitable direction.
Top 5 practical applications of IIoT in industrial automation
1. Remote access of machines
With remote access to industrial machines, service engineers and other stakeholders can conveniently access a machine from their current location, check the log files on its PLCs and change settings if required. It takes only a few minutes to access the machine and find the problem, which saves a time-consuming trip to the manufacturer's site.
2. Update new functionalities on HMIs
New functionalities are added to machines to make jobs faster and more efficient. While the programmer implements such functionality in the machine's control program, the HMI software needs to be updated and tested before the new functionality can be launched. HMI software updates can be applied remotely through secure network access over the internet, and with a web-based virtual network connection you can view and check the HMI functionality at any time on the IIoT platform.
3. Predictive analysis for machine maintenance
As with all hardware, even IIoT-enabled machines undergo wear and tear before finally being replaced with new equipment. Active, regular maintenance is therefore crucial to prevent downtime and reduced production output. Using the cloud to collect, store and access information on machine parts, maintenance engineers can keep track of the remaining useful life (RUL) of every asset. Automatic notifications can be sent to the right person when an asset reaches its maintenance limit. By analysing potential problems via remote access and online diagnostic tools, you are also more likely to order the right spare parts.
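As a rough illustration of the RUL bookkeeping described above, the sketch below tracks operating hours against a rated life per asset and raises a notification when the maintenance margin is reached; the asset names, rated lifetimes and notify function are hypothetical.

```python
# Hypothetical asset registry: rated life and accumulated operating hours.
assets = {
    "pump-07":    {"rated_hours": 20_000, "operated_hours": 19_400},
    "spindle-12": {"rated_hours": 8_000,  "operated_hours": 3_150},
}

MAINTENANCE_MARGIN = 0.05  # notify when less than 5% of rated life remains

def notify(asset_id: str, remaining_hours: float) -> None:
    # Stand-in for an e-mail/SMS/ticketing integration.
    print(f"Maintenance request: {asset_id} has ~{remaining_hours:.0f} h of useful life left")

for asset_id, data in assets.items():
    remaining = data["rated_hours"] - data["operated_hours"]
    if remaining <= data["rated_hours"] * MAINTENANCE_MARGIN:
        notify(asset_id, remaining)
```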
4. Analyse and optimize industrial robot actions
Industrial robots make repetitive work easy. IIoT features with remote access let you change the robot's program actions and get better insights from the log files. Video analysis can also help improve the actions of certain robots: access to live streams and IP camera recordings makes improvements far easier and faster. A VPN connection can be set up easily for full network access to any device connected to the robot.
5. Manage building automation data from multiple locations
IIoT can be used to monitor and control the heating, lighting, energy consumption, fire protection, employee safety and many other systems for multiple buildings from a central location. The real-time machine data can be transferred to a central cloud application, using industrial communication networks.
If you are planning to automate your processes in a smart way, then IIoT is the way to go. IIoT is bringing forth new business models to increase revenue, while at the same time acting as a force multiplier for improved productivity and efficiency. To know more about how Utthunga can help you create a smart building or factory and improve your business productivity, efficiency, reliability and ROI, visit https://utthunga.com/.
The Industrial Internet of Things (IIoT) is the next big thing happening across the industrial sector. An offshoot of IoT, this technology revolves around the use of sensors, devices and software for industrial automation. A survey by McKinsey Digital estimates that IIoT could generate an economic impact of up to $11.1 trillion by 2025, and companies that invest in IIoT are expected to capture a major share of the profits. The oil and gas industry is one of the industries where IIoT is expected to play a huge role, both in optimizing processes and in enhancing safety.
Below are four IIoT advantages that the oil and gas industry can leverage to add value to its integrated business strategies.
1. Asset tracking, monitoring & maintenance:
A typical oil and gas company has multiple refineries that need to be regularly inspected for maintenance and repair. Though this is very important, in many cases the staff may not be able to do a thorough physical inspection for various reasons. Setting up equipment and connecting it to an IoT network can greatly reduce the need for manual inspections.
IIoT can be effectively used to monitor the working condition of field devices, sensors, actuators, valves and other refinery assets. Sensors can be fitted on pumps, pipes, filters, valves and other components. These sensors collect data on asset operation, temperature, speed, pressure or other parameters based on pre-determined conditions, and transmit the data in real time to external storage, which could very well be in the cloud. An experienced technician can then analyse the data collected by the sensors to identify any malfunction, or impending malfunction, in the asset.
At the same time, the sensors enable technicians to keep track of all the mechanical components of the machinery used in the refineries. They also enable managers to keep track of replacement and spare parts; based on the tracking details, they know the exact location of each new spare or replacement part. IIoT thus enables real-time asset tracking and monitoring, which is not possible with manual inspections.
Moreover, the data collected by the IIoT network allows proactive identification of possible issues, so technicians can go straight to the exact point where an anomaly was observed and do the needful.
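As a small illustration of the real-time transmission to cloud storage described above, the sketch below publishes one pump reading as JSON over MQTT using the open-source paho-mqtt client; the broker address, topic and payload fields are assumptions for the example, not a prescribed architecture.

```python
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER = "broker.example.com"  # hypothetical cloud-side MQTT broker
TOPIC = "refinery/unit3/pump07/telemetry"

try:  # paho-mqtt 2.x requires an explicit callback API version
    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
except AttributeError:  # paho-mqtt 1.x constructor
    client = mqtt.Client()

client.connect(BROKER, 1883, keepalive=60)
client.loop_start()  # run the network loop in the background

reading = {
    "timestamp": time.time(),
    "temperature_c": 78.4,  # illustrative values from a pump skid
    "pressure_bar": 12.1,
    "flow_m3_h": 340.0,
}

# Publish the reading; a cloud application subscribed to the topic stores and analyses it.
info = client.publish(TOPIC, json.dumps(reading), qos=1)
info.wait_for_publish()

client.loop_stop()
client.disconnect()
```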
2. Data management:
The second major advantage of using an IIoT network is efficient and effective data management. Technologies such as cloud computing and standards-based connectivity solutions help in better data management, which in turn reduces expenses and improves the profit margin. Integrated sensors collect data over industrial protocols from various assets present in the supervisory, plant, fieldbus and ICS networks. The cloud is used to aggregate, integrate and store data from different sources in different formats, while methods such as edge analytics and edge processing are used to analyse the data and gain insights.
Two factors affect the informed decision-making process: first, the need for reliable and accurate data, and second, the loss of experienced personnel to organizational restructuring or retirement. Through data analysis and remote monitoring, effective asset management and well-designed maintenance programs ensure that decision makers can streamline and optimise rig operations.
Real-time data obtained from the IoT network can be used to improve the extraction process and drilling strategies. A study by Bain and Company shows that effective data management can help oil and gas companies improve production by 6 to 8%. With continuous monitoring and maintenance, many unnecessary expenses can also be cut, and the staff don't need to spend time and effort in the field trying to identify possible causes of problems: the automated network brings all the data to their fingertips.
3. Supply chain & logistics:
The supply chain and logistics is another area where IIoT can be very beneficial. Based on the data collected from various touchpoints of the network, managers can plan and schedule their procurements and supplies and identify best practices. Connectivity is a huge problem when pipelines and ships are transporting oil and gas; at such times, the stakeholders need to rely on satellite communication to transmit data, and it is difficult to regularly check the working condition of, and obtain updates on, the pipelines or ships. Low-power wide-area networks (LPWAN) can be installed in areas of the pipelines that are difficult to access. Additionally, wired and wireless networks can be set up along the transportation lines to collect relevant data and transmit it to the cloud via satellite connectivity. The office staff then have reliable and accurate information that helps them organise oil deliveries better.
4. Health & Safety:
One of the major areas where IIoT can play a huge role is the health and safety of employees, as well as monitoring the carbon footprint of the refinery. Oil and gas drills and refineries are usually located in dangerous, far-flung areas away from crowded cities. While the remote location makes it convenient to drill, the same distance can make it difficult to respond to health and safety issues. A network of connected sensors can provide real-time data on what's happening on the ground, and remote equipment monitoring with predictive maintenance can greatly reduce machine repairs and breakdowns. Data such as pressure, air conditions and other parameters captured by the sensors, along with real-time images captured by surveillance systems, help ensure that the highest safety standards are maintained at the job site. Workers at the drill site can be asked to wear trackers so that their locations can be immediately identified and emergency teams notified. All this data can be used to reduce accidents and fatalities.
IIoT data helps reduce spills, pipe leaks and accidents, which could cause environmental damage. It can also be used to analyse and identify areas where the carbon footprint can be reduced.
IIoT is definitely the future of the oil and gas industry. This technology helps enhance operational efficiency, reduce costs and aid business growth by delivering real-time, accurate data. Utthunga is a technology company with expertise in industrial automation. We can set up an integrated IIoT system for all your oil drilling sites and refineries based on your specific requirements. Contact us to know more.
It does not matter whether you are an experienced design engineer or a complete novice when it comes to industrial PCB design. What matters is that inside every electronic appliance or industrial mechanical device there is a printed circuit board (PCB) playing an important role. These green boards provide the physical platform on which a complete electronic system of circuits and components is built.
It is rightly said that the success or failure of any product largely depends on the PCB layout during product design and development. While PCB manufacturing (PCBM) and PCB assembly (PCBA) are two different processes with their own unique requirements, they both have PCB design as a common factor.
The astounding level of technical advancement in embedded and semiconductor integration is shrinking PCB designs. PCB layout and design services must keep up with the complexity and expectations of these designs, which are reaching new heights.
In this blog, we outline the top ten tips or guidelines for industrial PCB product design and development.
1. Identifying the project need
The customer's requirements dictate every embedded hardware design. This essentially involves a thorough analysis of your project, budget and requirements. Translating these requirements into an electronic form draws on the technical skills and experience of an electronic design engineer. What type of product is it? What is its operating environment? What are its powering options and communication mechanisms, and which specifications or regulations must it comply with? These are some of the questions PCB design engineers face. Having the answers helps identify the best possible layout and initial schematic of the PCB, as well as the BOM and other details of your final PCB design.
2. Schematics:
The schematic diagram is the blueprint that PCB manufacturers and assemblers follow during production. It is a diagrammatic representation of the electronic component symbols and the interconnections between them. Once all the design choices for the various components are finalised, this acts as the baseline for developing the schematic design. A preliminary review and analysis is then performed to check for potential problems, and the corrected version is fed to the PCB design software, which can run simulations to verify the correct functionality of the PCB. Some of the popular tools are Eagle, KiCad, Altium Designer and OrCAD.
3. Bill of Materials:
A bill of materials (BOM) is generated alongside the schematic's creation. This is very handy as it lists each component's quantity, numerical values (ohms, farads, voltage, etc.), part number, reference designators and PCB footprint. It defines how each component in the circuit is identified and located, the part numbers used for purchasing and substitution, its size with respect to the PCB, and so on. An up-to-date BOM will save many complications in the later PCB design stages.
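To show what such a BOM capture can look like in practice, here is a minimal sketch that models a few purely illustrative BOM rows and writes them to CSV, a format most assembly houses accept alongside the Gerber data.

```python
import csv

# Illustrative BOM rows: reference designators, value, part number, footprint, quantity.
bom = [
    {"refdes": "R1,R2", "value": "10k 1%",    "mpn": "EXAMPLE-RES-10K", "footprint": "0603",    "qty": 2},
    {"refdes": "C1",    "value": "100nF 50V", "mpn": "EXAMPLE-CAP-100N", "footprint": "0603",   "qty": 1},
    {"refdes": "U1",    "value": "3.3V LDO",  "mpn": "EXAMPLE-LDO-3V3",  "footprint": "SOT-223", "qty": 1},
]

with open("bom.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["refdes", "value", "mpn", "footprint", "qty"])
    writer.writeheader()
    writer.writerows(bom)

print(f"Wrote {sum(row['qty'] for row in bom)} components across {len(bom)} BOM lines")
```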
4. Component placement:
Placing each component in its designated spot on the circuit board is the most critical part of the design. Wrong placement can cause thermal, electrical-noise and power variations in the circuits, leading to malfunction of the PCB and product failure. The design schematic generated earlier is used to determine the correct spot. The most common order of component placement followed by designers is:
• Connectors
• Power circuits
• Sensitive and precision circuits
• Critical circuit components
• All other elements
The designer then verifies and reviews the initial component placement and adjusts it to facilitate routing and optimize performance. Placement is also adjusted based on cost and size. Due consideration is given to:
• PCB component placement with ground separation, power separation and isolation between high-frequency components
• Placing components with the same orientation for ease of assembly
• Printing the layout to check that component sizes match, to avoid assembly and mechanical issues
At this point, based on size and cost, the placement and package sizes are often reconsidered and changes are made.
5. Thermal Management:
One of the most frequent design issues is thermal management of the PCB. When heat dissipation is not considered, it leads to poor circuit performance or even a damaged board. Here are some tips to achieve high performance while keeping heating issues in mind:
• Place heat-sensitive components (like thermocouples or electrolytic capacitors) away from heat-generating components (like diodes, inductors and resistors)
• Use the PCB itself as a heatsink by adding more layers of solid ground or power planes connected directly to heat sources with multiple vias
• Thermal vias dissipate heat from one side of the PCB to the other. Larger and more numerous thermal vias lower the operating temperature of components and contribute to higher reliability (a rough estimate follows below)
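Below is a rough back-of-the-envelope sketch of that thermal-via estimate: it shows how the through-board thermal resistance drops as vias are added in parallel. The via geometry, plating thickness and copper conductivity are generic assumptions, not values from any specific design rule.

```python
import math

K_COPPER = 390.0          # W/(m·K), approximate conductivity of copper
BOARD_THICKNESS = 1.6e-3  # m, standard FR-4 thickness
DRILL_DIAMETER = 0.3e-3   # m, assumed via drill
PLATING = 25e-6           # m, assumed copper plating thickness in the barrel

def via_thermal_resistance() -> float:
    """Thermal resistance (K/W) of one plated via barrel: R = L / (k * A)."""
    r_outer = DRILL_DIAMETER / 2
    r_inner = r_outer - PLATING
    barrel_area = math.pi * (r_outer**2 - r_inner**2)
    return BOARD_THICKNESS / (K_COPPER * barrel_area)

single = via_thermal_resistance()
for count in (1, 4, 9, 16):
    # Vias conduct in parallel, so the combined resistance divides by the via count.
    print(f"{count:2d} via(s): ~{single / count:6.1f} K/W through the board")
```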
6. Wiring Considerations:
PCB wiring design is often a complicated process involving power routing, ground routing and other wiring considerations. It is grouped into single-sided, double-sided and multilayer wiring. It is good practice to follow these tips:
• Exchange wiring directions between layers; for multiple layers, alternate the direction
• Keep the wiring, and the distance between wires, uniform and equal
• Keep loops small in loop-back wiring
• Ensure the wiring can be soldered smoothly
• Use rounded corners for printed wiring in dense or high-frequency circuits
7. Mechanical Constraints:
Consider the mechanical constraints before starting placement, and review them during the embedded hardware design. These include the board materials used to create the PCB, the number of layers each PCB will have, the copper traces used to complete the connections, and component selection based on availability, quantities, package type, cost and other determining factors. The PCB shape, size and board area are constrained by the PCB-hosting cabinet used by the appliance or equipment.
8. Power Constraints:
Large current spikes or large voltage changes can interfere with low-voltage and low-current control circuits. A good power design includes adequate separation, placement and decoupling in its schematics.
In the PCB layout, it is recommended to use common rails for each supply, avoid long power loops while routing, and use solid, wide traces. Having separate power and ground planes internal to your PCB, kept symmetrical and centred, will also help prevent your board from bending. A low-impedance path reduces the risk of power circuit interference and helps protect your control signals. The same approach can be followed to keep your digital and analog grounds separate.
9. Routing details:
Differential signaling is the technique of using not one but two traces to transmit a signal on your board. The two signals are (ideally) equal in magnitude and opposite in polarity, so in the ideal case there is no net return current flowing through ground.
There are many reasons to use differential signaling, such as reducing EMI, getting precise timing when determining the current logic state of a differential pair, and isolating power systems.
To achieve these benefits, it is recommended that you always route the differential signal traces in parallel and keep them as short and direct as possible between components. Also, maintain equal length and width of the traces while keeping the spacing between the pair constant.
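A minimal sketch of the length-matching check described above: given routed segment lengths for the two traces of a pair, it flags any mismatch beyond a tolerance. The segment lengths and the 5 mil budget are illustrative assumptions.

```python
# Segment lengths (in mils) of each trace in a routed differential pair — illustrative values.
trace_p = [120.0, 340.5, 88.2, 210.0]
trace_n = [118.0, 342.0, 90.1, 213.4]

LENGTH_TOLERANCE_MIL = 5.0  # assumed intra-pair matching budget

len_p, len_n = sum(trace_p), sum(trace_n)
mismatch = abs(len_p - len_n)

print(f"P trace: {len_p:.1f} mil, N trace: {len_n:.1f} mil, mismatch: {mismatch:.1f} mil")
if mismatch > LENGTH_TOLERANCE_MIL:
    print("FAIL: add serpentine tuning to the shorter trace to restore timing balance")
else:
    print("PASS: pair is length-matched within tolerance")
```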
10. Perform Quality Checks:
One of the good practices of PCB layout and design services is to conduct a DFM and DFT checklist review before signing off your PCB design and generating the Gerber files. The Gerber format is an open 2D vector image file format used universally by PCB design software. It describes the PCB images and is necessary to manufacture and assemble the final boards. The checklist includes verification of component placements, routing paths, and a pre-scan of thermal integrity and signal integrity, among others.
Industrial PCB design can be a formidable task, but following the above basic techniques and best practices lays the foundation for your success. Our expert product design engineering services can guide you through your next PCB design project with cost-effective services and assured quality standards. Contact us today to learn more about our industrial product design engineering services, PCB design rules and assembly processes.
The Industrial Internet of Things (IIoT) is the most promising trend of the fourth industrial revolution. Industries are now leveraging the power of the internet to interconnect their systems and improve their overall communication. This interconnection produces the best results when there is a semantic flow of data and a well-defined architecture, and that is where the challenge lies for industries.
OPC Unified Architecture (OPC UA) is an international standard for industrial communication between the various layers of the industrial pyramid. It is an IEC standard protocol (IEC 62541) and the successor to the OPC Classic specification. What makes OPC UA different from OPC Classic is that it is platform-independent, unlike the latter, which could be used only on Microsoft Windows systems (being based on OLE/COM).
OPC UA gained popularity in 2007, around the time of the introduction of Service Oriented Architecture (SOA) in industrial automation systems. Being IEC compliant, it can be easily implemented in areas where communication between devices is required.
One of the major challenges that I4.0 poses for industries is interoperability. OPC UA offers a solution through its semantic communication standard. This is important because information transfer is mostly between devices and machines, which can hardly interpret ambiguous instructions: the more precise the instructions, the better the results they will produce.
The crux of implementing OPC UA well for your automation system is the choice of tools. Since the devices in IIoT, or in any industrial automation environment, are controlled by software applications, a well-functioning software development kit (SDK) is necessary. It ensures a quality experience for end users and software engineers alike.
Choose the right software development kit for OPC UA
The key to an effective OPC UA implementation is the right selection of a software development kit. We have listed ten points that automation manufacturers, OEMs, and discrete and process manufacturers should take note of while choosing an SDK.
Right SDK vendor
Many companies lack sufficient resources, both technical and human, and the best they can do is outsource their requirements. The chosen SDK must therefore meet their application requirements and improve the product's time to market, and it should pay off both in terms of money and performance. Most SDK vendors offer the basic functions that enable the fundamental OPC UA benefits, like security and APIs that provide better abstraction in various languages.
Scalability
A scalable SDK enables OPC UA to be implemented in all new and existing systems. Manufacturers should therefore consider a scalable SDK that can accommodate any type of hardware and is platform-, OS- and vendor-independent. This enables platform-independent OPC UA toolkits to work efficiently in any environment, be it a small embedded device or a large enterprise application.
Ease of Use
This is one of the obvious yet overlooked factors. An SDK should be easy to use and understand, so that OEMs or small-scale manufacturers can save the time and energy needed to absorb the OPC UA specification in depth. It must make it simple to deploy an application and provide integration through APIs.
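To give a feel for what "easy to use" means at the API level, here is a minimal client-side sketch using the open-source python-opcua package as a stand-in for a commercial SDK; the endpoint URL and node id are placeholders for a real server.

```python
from opcua import Client  # pip install opcua (python-opcua, used here as a stand-in SDK)

ENDPOINT = "opc.tcp://192.168.0.10:4840"  # placeholder endpoint of an OPC UA server
NODE_ID = "ns=2;s=Machine1.Temperature"   # placeholder node id exposed by that server

client = Client(ENDPOINT)
client.connect()
try:
    node = client.get_node(NODE_ID)
    print("Machine1.Temperature =", node.get_value())  # a single call reads a typed value
finally:
    client.disconnect()
```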
CPU Utilization
An OPC UA SDK written according to architectural principles for embedded systems utilizes much less CPU. It means the application can perform significant work even in a single thread, which comes in handy where multiple threads aren't available. This in turn reduces overall cost, as a low-cost CPU can do most of the work in such cases.
Memory Footprint
Software, of course, runs in memory. A good OPC UA implementation should not have a huge footprint and should be easy on RAM. Further, memory leaks can accumulate over time and bring the entire system down, so it is imperative that there be no memory leaks (under all use-case scenarios) in the OPC UA SDK.
Compatibility and Security
The OPC UA SDK toolkit must be compatible with a wide range of applications and security requirements. The OPC UA standard supports various security modes, and an ideal SDK should support all of them.
Language Support
Even though C++ is the most commonly used language for writing SDKs, other languages such as Java, C, .NET and others are also used depending on the requirements. Developing an OPC UA SDK in different languages poses its own challenges when making incremental improvements to products based on the already available specifications, such as AMQP, Pub/Sub and UDP.
Third-Party Libraries
Third-party libraries play a crucial role in software application development. Most companies have their preferred libraries; therefore, most SDK vendors offer wrappers for standard crypto and parsing libraries such as NanoSSL, mBed TLS, TinyXML2 and Lib2XML, along with use-case examples, manuals and API references.
Accommodate Future Enhancements
While implementing any protocol, manufacturers must ensure the SDK vendor has the knowledge and skills to stay agile with the ongoing developments around SDKs and OPC Foundation technologies such as AMQP, Pub/Sub, UDP and TSN.
Vendor Support
SDK vendors must be willing to support the manufacturers in every step of their OPC UA implementation with their expertise. A relationship based on trust, mutual benefits and understanding is key to an effective OPC UA implementation.
OEMs and discrete and process manufacturers should work with a team that understands OPC UA specifications and implements them in their best interest. Utthunga offers OPC UA services focused on making our clients I4.0 ready. Our expert team consists of professionals recognized by the OPC Foundation, armed with the right expertise and knowledge to implement OPC UA on any platform. Contact us to know more!
Industrial automation systems use Ethernet application-layer technologies like EtherNet/IPTM and Profinet, which often have issues related to bandwidth and payload. To overcome these shortcomings, Beckhoff Automation, a German automation company, came forward with a fieldbus system called the Fast Light Bus. This eventually evolved into EtherCAT (Ethernet for Control Automation Technology), which the company launched in 2003.
EtherCAT works on the basic principle of pass-through reading and combines the benefits of "on the fly" processing with a well-built infrastructure, which brings better efficiency and higher speed.
Utthunga's industry protocol implementation experts understand this concept of EtherCAT well. We integrate the best EtherCAT practices to enhance your hardware and software communication. This, in turn, enhances the overall productivity of the automation system.
Beckhoff promoted the EtherCAT protocol through the EtherCAT Technology Group (ETG), which works with the International Electrotechnical Commission (IEC); this has led to the standardization of EtherCAT over the years.
The EtherCAT standard protocol IEC/PAS6246, introduced in 2003, has since been standardized by IEC to become part of IEC 61158. One of the main features of EtherCAT is its versatility to work within most industrial plant setups. This flexibility makes it the fastest Industrial Ethernet technology, suitable for both hard and soft real-time requirements in automation technology, test and measurement, and many other applications.
Another feature of EtherCAT is that the EtherCAT master supports various slaves both with and without an application controller, which makes implementation seamless and successful.
EtherCAT switches are another notable feature of EtherCAT. Here the switching portfolio refers to the managed, unmanaged and configurable switch product lines. An example is Fast Track Switching, which offers reliable switching decisions even in a mixed communication network. Most of these switches are economical enough to be used in switch cabinets; they have robust metal housings and support VLANs, IGMP snooping and other SNMP management features.
To get the best out of an EtherCAT implementation in your industrial plant, you need a robust implementation strategy in place. To that end, we have jotted down the strategies you can use to implement EtherCAT:
EtherCAT infrastructure
The EtherCAT infrastructure is quite powerful, as it includes various safety and communication protocols and supports multiple device profiles. Going deeper into the architecture, the EtherCAT master uses a standard Ethernet port and network configuration information. This configuration can be fetched from the EtherCAT Network Information (ENI) file. The EtherCAT Slave Information (ESI) files, which are unique to each device and provided by the vendor, form the basis of the ENI.
If you are working on a load-dependent servo task, the location (Master or Servo Drive) plays a key role in selecting an EtherCAT mode of operation.
EtherCAT Slave & EtherCAT Master devices
EtherCAT slave devices are connected to the master over Ethernet. Various EtherCAT topologies can be used to connect slaves to the configuration tool, which in turn is connected to the EtherCAT master via the above-mentioned EtherCAT Network Information file. This configuration tool plays a pivotal role in implementing EtherCAT and connecting the slaves to the master in the right way. The tool generates a network description, the EtherCAT Network Information (ENI) file, based on the EtherCAT Slave Information (ESI) files and/or online information read from the EEPROM of the slave devices, including their object dictionaries.
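As a small illustration of how configuration tooling consumes these XML descriptions, the sketch below parses a simplified, ESI-like device description (embedded inline so the example is self-contained) and lists the declared devices; the element names follow the general shape of ESI files but are heavily trimmed for illustration.

```python
import xml.etree.ElementTree as ET

# Simplified, ESI-like device description; real vendor ESI files are far richer.
ESI_SAMPLE = """
<EtherCATInfo>
  <Vendor><Name>Example Vendor</Name></Vendor>
  <Descriptions>
    <Devices>
      <Device>
        <Type ProductCode="#x00000001" RevisionNo="#x00000001">EXAMPLE-8DI</Type>
        <Name>8-channel digital input (illustrative)</Name>
      </Device>
      <Device>
        <Type ProductCode="#x00000002" RevisionNo="#x00000001">EXAMPLE-8DO</Type>
        <Name>8-channel digital output (illustrative)</Name>
      </Device>
    </Devices>
  </Descriptions>
</EtherCATInfo>
"""

root = ET.fromstring(ESI_SAMPLE)
print("Vendor:", root.findtext("Vendor/Name"))

# A configuration tool walks the device descriptions to build its view of the network (ENI).
for device in root.findall("Descriptions/Devices/Device"):
    type_el = device.find("Type")
    print(f"  {type_el.text:12s} product={type_el.get('ProductCode')} "
          f"rev={type_el.get('RevisionNo')}  ({device.findtext('Name')})")
```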
Several EtherCAT slave devices work synchronously with the EtherCAT master through various tuning methods. Tasks such as setting outputs, reading inputs and copying memory can be considered here, where synchronization at the logical level of the devices plays an imperative role. To implement an EtherCAT network, all the master needs is a standard Network Interface Controller (NIC, 100 Mbit/s full duplex). To seamlessly integrate master and slaves, master software with an actual run-time drives the slaves.
EtherCAT Operation Modes and Network Topology
One of the challenging parts of implementing EtherCAT in your control system is choosing the right operation modes, because each mode places different demands on the operating system and the master. Possible dynamic load changes, custom control-loop algorithms and the demands of an "on the fly" processing system need to be considered while choosing the operating mode. Here is a brief overview of three important operating modes: CAN over EtherCAT, File over EtherCAT and Ethernet over EtherCAT.
CAN over EtherCAT (CoE)
One of the most widely used communication protocols, CANopen, is used in this mode. It defines specific profiles for different devices, and this operating mode ensures a high-speed EtherCAT network.
File over EtherCAT (FoE)
This operating mode gives you access to the data structures or information files in a device, which enables uploading standardized firmware to devices, regardless of whether they support other protocols like TCP/IP.
Ethernet over EtherCAT (EoE)
This operating mode allows Windows client applications to communicate with an EtherCAT device's server program via Ethernet tunnelled over the EtherCAT network. It is the simplest way for master and slave to connect and reduces the overall implementation time.
EtherCAT is one among many Ethernet-based fieldbus protocols, but it has garnered significant popularity for industrial automation applications. It is optimized for industrial devices like programmable logic controllers (PLCs), I/O and sensor-level devices. Implementing EtherCAT offers higher control-system efficiency, fewer errors, and the ability to connect up to 65,535 nodes with low latency in each slave node.
Utthunga's services help you implement EtherCAT in the best possible way, so your industrial automation systems can enjoy the benefits of its versatility to the fullest.
Industrial revolution 4.0 has already set in, and industrial automation is a profound part of it. One of the crucial aspects of implementing a successful automation ecosystem in any industry is seamless communication between devices. For a long time, traditional fieldbuses like PROFIBUS, HART, FF, Modbus and a few others have been the standard communication solution for field-layer connectivity.
However, with the ubiquity of Ethernet in the layers above the sensors/PLCs, and to take advantage of IT tools and technologies in the OT layer, Ethernet is increasingly being looked at as a communication bus in the field layer as well. This has led to the idea of the Ethernet Advanced Physical Layer (APL).
Ethernet-APL is a subset of the widely used Ethernet standard: it describes a physical layer for Ethernet communication specially designed for industrial use in the field. With high communication speeds over long distances and a single twisted-pair cable carrying both power and the communication signal, it is a robust, higher-bandwidth communications link between field-level devices and control systems in process automation applications. In simple terms, Ethernet-APL is the upgraded link between Ethernet communication and instrumentation.
Ever since BASF, a German chemical company and the largest chemical producer in the world, successfully tested Ethernet-APL for the first time in 2019, many companies have implemented it in various IIoT networks. In February 2020, ABB's trials showed that Ethernet-APL effectively eliminates gateways and protocol conversions at various industrial network levels.
Ethernet-APL makes infrastructure deployment a seamless process, as the devices connected over it share the same advanced physical layer. This also means devices in the industrial network can be connected at any time, irrespective of where they are placed in the factory or processing plant.
There are numerous reasons why industries willing to integrate IIoT must consider Ethernet-APL. We have discussed them in the next sections.
Ethernet-APL enables seamless integration of various processes and creates effective communication between the control system and plant field devices, carrying process variables, secondary parameters and asset-health feedback seamlessly over long distances.
Some of the major benefits of incorporating Ethernet APL in industrial automation applications are:
Improved Plant Availability
In addition to pure process values, modern field devices provide valuable additional data. With Ethernet-APL, plant operators can make the most of the devices in real-time, centrally monitor their components’ status, and identify maintenance requirements early on. This avoids unplanned downtime and increases plant availability significantly.
Cost-Effective Plant Optimization
Ethernet-APL supports the trunk-and-spur topology established in the process industry and is applicable to any industrial Ethernet protocol such as EtherNet/IP, HART-IP and PROFINET. This simplifies integration for planners, plant designers and plant operators, since existing installations and infrastructure can still be used and investments are protected.
Adds Flexibility to the Plant
IEEE and IEC standards lay out the communication protocol, testing and certification of products so that Ethernet-APL can be implemented in any plant's automated systems anywhere in the world. This way, devices from different manufacturers, irrespective of their country of origin, can interoperate within the working ecosystem of an industrial environment.
Coherent Communication at all levels
Ethernet-APL allows a common communication infrastructure for all levels of process management, because field devices can be easily connected to higher-level systems. The high transfer speed of 10 Mbit/s and the full-duplex infrastructure make it suitable for data transmission over lengths of approximately 1000 m.
APL – For IIoT Applications
The Industrial Internet of Things is undoubtedly an integral part of the industrial automation workspace. Therefore, the high-speed, industrial-Ethernet-based Ethernet-APL is touted as the future of industrial communication systems. Many leading communication protocol associations, such as the OPC Foundation, ODVA, and PROFIBUS & PROFINET International, are in the process of supporting APL, which makes it compatible with most existing processing systems.
It supports 2-WISE (2-wire intrinsically safe Ethernet) and therefore eliminates the need for numerous calculations, which makes it simpler to verify the intrinsic safety of devices within the Ethernet-APL automation network.
Ethernet-APL comes as a blessing for the manufacturing and process industries in particular, as they have lacked a standard network capable of high-speed data transfer between field devices, irrespective of their level in the Industry 4.0 architecture.
How APL is Serving the Special Requirements of Process Industries
Ethernet-APL is specially crafted for the process industries. Since these industries involve work in hazardous and explosive areas, deploying industrial Ethernet seemed a distant prospect for quite a long time. However, with the introduction of an advanced physical layer for Ethernet, 2-WISE became a reality.
The 2-WISE infrastructure makes Ethernet-APL safe to deploy in such hazardous areas. This improves overall plant availability and brings remote access to many devices in process industry 4.0.
Conclusion
Advanced Physical Layer or APL has brought in a new ray of hope for effective adoption and implementation of IIoT in the industries. Utthunga’s innovation-driven team is ready to support you in your APL plans. Get in touch with us and get the best industrial engineering services that elevate the efficiency of your plant and plant assets for increased ROI.
The term “simulator” means “imitator of a situation or a process”. In the digital sense, a protocol simulator or network simulator is a computer-generated simulation of a protocol, used before bringing the product to the market.
There is a paradigm shift among industrial OEMs and discrete, power and process utilities towards automation. This implies more devices interconnected over the internet, with interlinked communication between them. To ensure a reliable and seamless automated working ecosystem in IIoT, foundations like the OPC Foundation, ETG, PI and others have laid down industrial protocols that a product must follow.
Protocol testing is a crucial element that product engineering companies like Utthunga take care of. It is imperative because it checks the alignment of a hardware or software product with the industrial protocol standards, helping to address issues, be they design glitches or implementation challenges. Protocol simulation is a part of product testing; it helps check whether the hardware or software works as per the communication protocol standard and its intended purpose.
Protocol simulation is mainly carried out to check the accuracy and latency of communications over the wire. It is done by creating scenarios that mimic real-time use cases, which helps you evaluate the possible risks and challenges associated with the product. Knowing these before release helps you create a product that stands apart in quality from your competitors.
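To make the latency aspect concrete, here is a self-contained sketch that spins up a trivial TCP echo endpoint (standing in for a simulated device) and measures request/response round-trip times; the port, payload and iteration count are arbitrary choices for the example.

```python
import socket
import statistics
import threading
import time

HOST, PORT = "127.0.0.1", 15020  # arbitrary local port standing in for a simulated device

def echo_server() -> None:
    """Trivial stand-in for a protocol simulator endpoint: echoes every request back."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while data := conn.recv(1024):
                conn.sendall(data)

threading.Thread(target=echo_server, daemon=True).start()
time.sleep(0.2)  # give the server thread a moment to start listening

latencies_ms = []
with socket.create_connection((HOST, PORT)) as cli:
    for _ in range(50):
        start = time.perf_counter()
        cli.sendall(b"READ HR 0 10")  # illustrative request frame
        cli.recv(1024)                # wait for the echoed response
        latencies_ms.append((time.perf_counter() - start) * 1000)

print(f"round-trips: {len(latencies_ms)}, "
      f"median: {statistics.median(latencies_ms):.3f} ms, "
      f"max: {max(latencies_ms):.3f} ms")
```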
How Simulation Can Save Your Product Development Time and Cost
Simulation can be carried out in various ways; it all depends on your ultimate goal. If reduced product development time and cost are on your checklist, you can use the simulation approaches listed below:
Protocol simulation to test for design reliability
In industries, especially in the current automated ecosystem, the device you manufacture must comply with the industry standards. When you create a device prototype and simulate it to test the design capabilities, you get to interact with unknown design features and may discover some loopholes as well. This saves product development time, as you optimize your product before it reaches the market. This way you can fix the glitches and then move on to mass production.
Finite element analysis
Industrial devices are subjected to many unpredictable scenarios and stresses, which requires your product to be robust enough to handle such unforeseen situations. Finite element analysis (FEA) helps you validate your product in this context: it ensures your product can endure unpredictable stresses (in the connectivity/communications context) up to a certain limit. You can carry out FEA even at the design stage, to get a realistic idea of what to expect from your product and the areas that need improvement.
It helps improve the reliability of the product before an untested product reaches your customers and ruins your brand image. It also makes manual testing a lot easier.
APIs in protocol simulation allow easy integration of your product with various software frameworks. This means test engineers can leverage better test automation solutions to carry out protocol simulations with high precision. Utthunga's protocol simulators are configurable as a server-side application for your industrial devices, enabling remote control of the devices through programming languages like Python, Java, C++ and others.
Industries have complex systems. Protocol simulators, such as master simulators and slave simulators, help create a reliable product when used in the product development cycle.
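Utthunga's uSimulate-based simulators are proprietary, so as a generic stand-in the sketch below starts a minimal Modbus TCP slave simulator with the open-source pymodbus package; the register contents and port are arbitrary, and the class/import names follow the pymodbus 3.x layout (they differ slightly in other versions).

```python
# Minimal Modbus TCP slave simulator using the open-source pymodbus package
# (pip install pymodbus; names below follow the 3.x layout).
from pymodbus.datastore import (
    ModbusSequentialDataBlock,
    ModbusServerContext,
    ModbusSlaveContext,
)
from pymodbus.server import StartTcpServer

# 100 holding registers pre-loaded with illustrative process values (e.g. temperatures x10).
holding_registers = ModbusSequentialDataBlock(0, [0] * 100)
holding_registers.setValues(1, [220, 221, 223, 219])

slave = ModbusSlaveContext(hr=holding_registers)
context = ModbusServerContext(slaves=slave, single=True)

# Any Modbus master/client under test can now poll this simulated device on port 5020.
StartTcpServer(context=context, address=("0.0.0.0", 5020))
```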
Since such a simulator is capable of providing practical feedback at the design stage itself, it comes across as a time and cost saver. It also empowers design engineers to understand possible glitches in the design and create an optimum layout.
These simulators allow the required prototype to be simulated in the comfort of the lab, during R&D or engineering. Control systems can be built to test the devices under various load and real-time scenarios. The simulators can run on a desktop and be integrated with the control systems and other master systems that communicate with the field devices. The overall infrastructure, and the cost to procure, deploy and maintain devices, can therefore be reduced considerably.
In research and development, these protocol simulators act as a perfect aid to train operational personnel and build in-depth knowledge of the product's functionality. They also help the R&D department come up with innovative ideas for creating a better product that matches the growing demands of users.
Conclusion
A protocol simulator helps create a virtual representation of the product even at the design stage. It helps design and product engineers understand the dynamics of the device's operation at each phase of the production cycle. Choosing the protocol simulator should therefore be a well-thought-out decision if you are keen on creating error-free, top-quality devices. Utthunga's protocol simulators are carefully created by a panel of experts with years of experience in this field. Get in touch with our team to learn more about our services tailored to get you Industrie 4.0 ready. Utthunga has deep capabilities in industrial protocols, and our protocol simulators are an extension of that rich protocol expertise. All our protocol simulators are built on top of our uSimulate framework, tried and tested in the field for years. We support several protocols like Modbus, EtherCAT, IEC-104, GE-GSM and others, and adding a new protocol (legacy or proprietary) to the simulator family is fairly easy as well.
The Open Process AutomationTM Standard (O-PASTM Standard), or "Standard of Standards" as it is popularly known, is an initiative to create a new-age automation system with a different architecture from existing process automation systems that use Distributed Control Systems (DCS) and Programmable Logic Controllers (PLCs). Because automation applications require ultra-high availability and real-time performance, process automation systems have always been highly proprietary. The motivation behind this standard is to move from closed, proprietary distributed control systems towards a standards-based open, secure and interoperable process automation architecture.
In 2016, The Open Group launched the Open Process AutomationTM Forum (OPAF) to create an open, secure and interoperable process control architecture to:
Facilitate access to leading-edge capacity
Safeguard asset owner’s application software
Enable easy integration of high-grade components
Use an adaptive intrinsic security model
Facilitate innovation value creation
This blog aims to show why and how OPC UA can be applied to realize the Open Process AutomationTM Standard. Before that, let us get familiar with the Open Process AutomationTM Forum. In simple terms, The Open Group's Open Process Automation™ Forum is an international forum comprising users, system integrators, suppliers, academia and other organizations.
These stakeholders work together to develop a standards-based, open, secure and interoperable process control architecture called the Open Process AutomationTM Standard, or O-PASTM. Version 1 of O-PASTM, published in 2019, addressed the critical quality attribute of interoperability. Version 2, published in January 2020, addressed configuration portability, and version 3.0 will address application portability.
Version 1.0 of the O-PASTM Standard unlocks the potential of emerging data communications technology and was created with significant input from three existing standards.
The seven parts that make up the latest preliminary 2.1 version of the O-PASTM Standard are:
Part 1 – Technical Architecture Overview
Part 2 – Security (informative)
Part 3 – Profiles
Part 4 – Connectivity Framework (OCF)
Part 5 – System Management
Part 6 – Information Models based on OPC UA (Multipart specification ranging from 6.1 to 6.6)
Part 7 – Physical Platform
Part 1 – Technical Architecture Overview
This informative part describes an O-PAS-conformant system through the set of interfaces to its components.
Part 2 – Security
This part addresses the cybersecurity functionality of components that should be conformant to O-PASTM. This part of the standard also explains the security principles and guidelines incorporated into the interfaces.
Part 3 – Profiles
This part defines the hardware and software interfaces for which OPAF needs to develop conformance tests to ensure the interoperability of products. A profile describes the set of discrete functionalities or technologies available for each DCN. Profiles may be composed of other profiles, facets, and individual conformance requirements.
Part 4 – O-PASTM Connectivity Framework (OCF)
This part forms the interoperable core of the system, and OCF is more than a network. OCF is the underlying structure that enables disparate elements to interoperate as a system. This is based on the OPC UA connectivity framework.
Part 5 – System Management
This part covers the basic functionality and interface standards that allow the management and monitoring of functions using a standard interface. The system management addresses the hardware, operating systems, and platform software, applications, and networks.
Part 6 – Information and Exchange Models
This part defines the common services and the common information exchange structure that enable the portability of applications such as function blocks, alarm applications, IEC 61131-3 programs, and IEC 61499-1 applications among others.
Part 7 – Physical Platform
This part defines the Distributed Control Platform (DCP) and the associated I/O subsystem required to support O-PASTM conformant components. It defines the physical equipment used to embody control and I/O functionality.
The O-PASTM Standard supports communication interactions within a service-oriented architecture. It outlines the specific interfaces of the hardware and software components used to architect, build and start up automation systems for end users. Version 2.0 of the O-PASTM Standard addressed configuration portability and can be used in an unlimited number of architectures, meaning every process automation system can be made "fit for purpose" to meet specific objectives.
Why OPC UA is important for Open Process AutomationTM Forum
The lower L1 and L2 layers of the automation pyramid are heavily proprietary, with tight vendor control over the devices where the PLCs, DCS, sensors, actuators and I/O devices operate. This is where vendors have a stronghold over end users and, as a revenue-generating path, they are reluctant to lose this advantage. It also creates interoperability, security and connectivity issues, causing significant lifecycle and capital costs for the stakeholders.
This inherent lack of standardization in the lower OT layers is a constant pressure point for the automation industry. The O-PASTM Standard addresses this standardization and connectivity issue and uses OPC UA as one of the foundations of the standard. This de-facto standard is used for open process automation, integrating controls, data and enterprise systems, and serves as a fundamental enabler for manufacturers.
Building the basic components of the standard (such as DCNs, gateways, OCI interfaces and the OCF) on OPC UA helps achieve secure data integration and interoperability at all levels of IT/OT integration. This involves leveraging the OPC UA connectivity (Part 4 of O-PASTM) and information modeling capabilities (Part 6 of O-PASTM), which play a key role in the O-PAS™ reference architecture.
How O-PASTM leverages OPC UA
In the O-PASTM reference architecture, the Distributed Control Node (DCN) is at the heart of the system. A single DCN is like a small machine capable of control, running applications and other functions, and it exchanges data seamlessly with the higher Advanced Computing Platform (ACP) layers. This component interfaces with the O-PASTM Connectivity Framework (OCF) layer, which is based on the OPC UA connectivity framework.
The connectivity framework allows interoperability for process-related data between DCN instances. It also defines the mechanisms for handling the information flow between the DCN instances and the run-time environments used to communicate data.
Basically, each DCN has a profile, which describes a full set of defined functionalities or technologies.
The DCNs (i.e., O-PAS-conformant components) are built conforming to any one of the primary profiles specified in O-PASTM:
OBC – O-PAS Basic Configuration
OCF – O-PAS Connectivity Framework (OPC UA Client/Server, OPC UA PubSub profiles)
OSM – O-PAS System Management
NET – Network Stack
CMI – Configuration Management Interface
SEC – Security
DCP – Distributed Control Platform (physical hardware)
The OPC UA information modeling capability is used to define and build these DCN profiles. Part 6 of the O-PASTM and its subparts define related sets of information and exchange models, such as basic configuration, alarm models and function block models. This provides a standard format for exchanging import/export information across management applications, and standard services for downloading/uploading information to O-PASTM-conformant components.
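To illustrate, in miniature, what building an information model looks like in code, the sketch below uses the open-source python-opcua package to expose a tiny object with two variables; the namespace URI, object and variable names are illustrative and far simpler than a real O-PASTM profile.

```python
import time

from opcua import Server  # pip install opcua (python-opcua)

server = Server()
server.set_endpoint("opc.tcp://0.0.0.0:4840/example/server/")  # placeholder endpoint

# Register an illustrative namespace and build a tiny information model under Objects.
idx = server.register_namespace("http://example.org/dcn-demo")
objects = server.get_objects_node()
dcn = objects.add_object(idx, "ExampleDCN")
temperature = dcn.add_variable(idx, "Temperature", 21.5)
status = dcn.add_variable(idx, "Status", "RUNNING")
temperature.set_writable()  # allow clients to write this value back

server.start()
try:
    while True:
        # In a real component the variables would be refreshed from process data.
        temperature.set_value(21.5)
        time.sleep(1)
finally:
    server.stop()
```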
The report OPC UA Momentum Continues to Build, published by the ARC Advisory Group and endorsed by the OPC Foundation, provides timely insights into what makes OPC UA the global standard of choice for industrial data communications in the process and discrete manufacturing industries. From an IIoT and Industry 4.0 perspective, the report examines how OPC UA technology solves the interoperability challenges.
Key takeaways from the report that help explain OPC UA adoption include:
OPC UA standard is open and vendor agnostic, and the standard and Companion Specifications are freely available to everyone.
OPC UA is an enabler for next-generation automation standards that will potentially change the industry structure of process automation, e.g. the Ethernet Advanced Physical Layer (Ethernet-APL), NAMUR Open Architecture, and the Open Process Automation Forum (OPAF)
OPC UA is arguably the most extensive ecosystem for secured industrial interoperability
OPC UA is independent of underlying transport layers. As such, it uses the most suitable transports for the right applications (ex. TCP, UDP, MQTT, and 5G)
OPC UA is highly extensible via its Information Modeling (IM) capabilities. This makes OPC UA an excellent fit for use by automation vendors and other standards organizations wishing to express and share semantic data seamlessly across all verticals.
The OPC Foundation Field Level Communications (FLC) Initiative is defining a new OPC UA Field eXchange (OPC UA FX) standard that is supported by virtually all leading process automation suppliers.
OPC UA FX will extend OPC UA to the field level to enable open, unified, and standards-based communications between sensors, actuators, controllers, and the cloud.
Forward-looking companies should make OPC UA a crucial part of their long-term strategies today because the changes this technology brings will become a necessity faster than most people anticipate
OPAF is making outstanding progress in creating a comprehensive, open process automation standard. Since it is partially built on other established industry standards like OPC UA, the O-PASTM Standard can improve interoperability across industrial automation systems and components.
OPAF fulfills its mission to deliver effective process automation solutions through collaborative efforts with the OPC Foundation. With Utthunga's expertise in the OPC UA standard, and by adopting our OPC-related products and solutions, businesses can benefit from low implementation and support costs for end users, while vendors can experiment around an open standard.
Get in touch with our OPAF experts to experience a new-age open, secure by design and interoperable process automation ecosystem.
Industrial revolution 4.0 has already started to show signs of significant change in various industrial operations. From manufacturing to automotive, finance and production, every business process is being explored to unveil the potential of automating it.
Industries are striving hard to stay in tune with the latest technological advancements and remain relevant in the digital era. The popularity of software-based automation for industrial units has therefore seen a sharp rise: according to one survey, the industrial control and factory automation market is expected to grow from USD 151.8 billion in 2020 to USD 229.3 billion by 2025, at a CAGR of 8.6%.
I4.0 brings a lot of improvements to the manufacturing industry. OEMs, in particular, are embracing the rapidly changing technology and are implementing software that needs timely upgrades and the inclusion of new features.
Even though the changes work for the betterment of the system, they may also bring unwanted alterations to existing features. Hence, proper regression testing is required to check that the changes do not alter the intended purpose of the system.
Regression testing uses the requirement specifications as the basis for creating test cases and looks for bugs or faults in the software system. As more and more OEMs and factory processes drift towards remote functions and software implementation, this testing helps them improve the overall quality of the software.
Improve efficiency: An OEM with error-free software ensures precision in its operations. Regression testing checks for deviations in the software each time a modification is made.
Better monitoring and decision-making: In some cases, especially when dealing with complex software, OEMs tend to lose track of code modifications. Regression testing makes this easier, as it keeps a record of all the changes made. This, in turn, aids in monitoring the changes and in decisions about deploying the final software.
Reduce unnecessary manufacturing costs: Regression testing identifies errors and notifies the OEMs so they can be fixed at an early stage. A bug fixed only at the production/manufacturing stage of the product life cycle results in huge costs, so catching defects early helps keep the final product error-free.
Continuous operation: A crucial aspect of the successful deployment of I4.0 is assuring the interconnectivity and automation of devices. Regression testing ensures that bugs are fixed and all the interconnected devices work together seamlessly.
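As a concrete illustration of deriving regression test cases from a requirement specification, here is a minimal pytest sketch. The module hmi_controller, the function clamp_setpoint, and the 0-100 clamping rule are hypothetical placeholders standing in for a real specification item.

import pytest
from hmi_controller import clamp_setpoint  # hypothetical module under test

@pytest.mark.parametrize(
    "requested, expected",
    [(-10, 0), (0, 0), (55, 55), (100, 100), (150, 100)],  # spec: clamp to 0..100
)
def test_setpoint_is_clamped(requested, expected):
    # Re-run after every modification: a failure means the change broke an
    # existing, specified behaviour.
    assert clamp_setpoint(requested) == expected

Such a test is re-run after every modification, so any change that violates the specified behaviour is caught before it reaches production.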
There are different ways regression testing can be carried out. Based on your requirements and the complexity of the software, an appropriate regression mechanism is chosen.
In industrial automation, devices need to be connected to one another. With every additional device, the software may need changes to its code or features. Testing here ensures that the introduction of a new device, or an upgrade, does not alter the functions of the existing setup.
In an OEM unit, regression tests are mostly executed at the design stage, to find bugs immediately, and at the production stage, to decide whether the quality of the product matches the customer's specification.
If a functional change is needed in any of the devices, the corresponding code has to be changed; here, regression testing verifies that the change produces the desired outcome without breaking existing behaviour, as the sketch below illustrates.
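For the device-integration scenarios described above, a common pattern is to re-run one and the same contract suite against every connected device, including the newly added one. The driver classes in this sketch are hypothetical stand-ins for real device drivers.

import pytest
from drivers import ModbusDriver, ProfinetDriver, NewSensorDriver  # hypothetical drivers

ALL_DRIVERS = [ModbusDriver, ProfinetDriver, NewSensorDriver]

@pytest.fixture(params=ALL_DRIVERS, ids=lambda cls: cls.__name__)
def device(request):
    # Instantiate and connect each driver in turn, old and new alike.
    dev = request.param()
    dev.connect()
    yield dev
    dev.disconnect()

def test_read_returns_numeric_value(device):
    # Existing behaviour that every driver, old or new, must preserve.
    assert isinstance(device.read_value("temperature"), (int, float))

Adding a new driver to ALL_DRIVERS automatically subjects it to the same checks, so the existing setup cannot be broken silently.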
To keep up with an evolving market, manufacturing industries, and industrial automation in particular, are working in agile environments. The DevOps culture is being widely adopted by industrial automation companies for the on-time and efficient deployment of new software technologies.
The constant upgrades and features introduced by OEMs can change the way the whole system works. This creates an agile environment in which continuous change carries a high amount of risk.
These risks include fatal bugs, repeated errors, duplicate entries, and more, all of which can culminate in either non-delivery of the product or a delay in deployment. Both cases can be avoided by continuously keeping a check on the source code and its impact through regression testing.
Benefits of Regression Testing In Agile Environments
OEMs and factory processes are focusing on blending into an agile environment to build a better technology-enabled workspace. This, along with the current DevOps culture, has helped industrial automation create a digital identity of its own, even in times of cutthroat competition.
Regression testing helps OEMs to manufacture more reliable products and provide better services. Apart from this obvious benefit, some of the crucial ones are listed below:
Since the software testing and development teams can easily identify bugs, they are motivated to deliver high-end, bug-free devices.
Each test case is handled and verified separately, which ensures a seamless functional process.
It ensures that bugs are fixed and the products are ready to be launched in the market.
Bug-free software ensures better communication between the interconnected devices in an automation system.
Conclusion
The future of industrial automation belongs to agile environments and DevOps. These not only offer a better way of coping with changing scenarios but are also crucial for delivering services with utmost precision. With big data and artificial intelligence reaching new heights, industries are sure to leverage them in software testing to get the best out of the agile and DevOps culture. Catch up with the most effective testing solutions offered by Utthunga. Contact us to know more.
As the Industrial Internet of Things takes hold, we are seeing more and more desktop/software electronics being used to build smart devices, machines, and equipment for manufacturing OEMs. These devices are the "things" in IIoT; they form a connected ecosystem and sit at the core of the digital thread.
Desktop/software product development therefore holds an important place in the adoption of IIoT. Here, selecting a reliable platform is crucial in determining the overall time to market, the cost of production, and product quality. Test automation services and simulation are widely used together to produce reliable and stable desktop/software devices.
Simulation refers to the process in which a sample device model is exercised under practical conditions to uncover unknown design interactions and gain a better perspective on possible glitches. It helps test automation streamline the defect identification and fixing process.
An automated testing process combines simulation and testing to improve the overall efficiency of the desktop/software device. In the current technological epoch, smart test automation is the "smarter" option for creating reliable desktop/software devices from the ground up.
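As a small illustration of pairing simulation with automated testing, the sketch below drives a simulated pressure sensor through an alarm check; the sensor class, the alarm rule, and the 2.0 bar limit are illustrative assumptions, not a product specification.

class SimulatedPressureSensor:
    """Minimal device model that can be forced into a fault state."""
    def __init__(self, pressure_bar: float = 1.0):
        self.pressure_bar = pressure_bar

    def read(self) -> float:
        return self.pressure_bar


def over_pressure_alarm(sensor: SimulatedPressureSensor, limit_bar: float = 2.0) -> bool:
    """Logic under test: raise an alarm when pressure exceeds the limit."""
    return sensor.read() > limit_bar


def test_alarm_triggers_on_over_pressure():
    assert over_pressure_alarm(SimulatedPressureSensor(pressure_bar=2.5)) is True


def test_alarm_stays_silent_in_normal_range():
    assert over_pressure_alarm(SimulatedPressureSensor(pressure_bar=1.2)) is False

Because the fault condition lives in the simulated model rather than in real hardware, the same check can run on every build long before a physical prototype exists.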
Smart Test Automation: a Revolution for Desktop/Software Applications
Industrial automation is at the core of Industrie 4.0. The inclusion of smart devices into the automated industrial network has made many manual work processes easier and more accurate. With the emergence of software-driven desktop/software systems, the industrial automation sector is witnessing a tectonic shift towards a better implementation of IIoT.
As the dependence on these devices increases, desktop/software device testing should not be an afterthought while implementing the big picture. That said, carrying out multiple tests in an IIoT environment where the number of desktop/software systems is increasing can be challenging. To improve the overall accuracy, bespoke smart test automation for desktop/software devices is required.
Smart test automation is the platform wherein desktop/software devices are tested to understand their design interactions and discover possible glitches in device operation. This is important because it ensures that the product does what is expected of it. Over time, this approach has proven to deliver strong results: desktop/software applications work more effectively, thereby improving the overall efficiency of IIoT systems.
How does Test Automation help to build sound Desktop/Software Devices?
Desktop/software application testing is often mistaken for software testing, but the two are quite different. Desktop/software product testing involves validation and verification of both hardware and firmware. The end goal is to create a desktop/software device that meets the user requirements. Automated desktop/software device testing works well for this purpose, as it involves many iterations and tests both firmware and hardware requirements. Below are some of the advantages of automated testing that set it apart from manual testing:
Improved productivity
One cannot deny that manual testing means a highly stressed QA team and a higher risk of human error. Having an automated testing system in place takes much of the stress off the QA team, as it allows a seamless feedback cycle and better communication between departments. It also makes it easy to maintain the automated test logs. Together, these factors lead to reduced product-to-market time and a highly productive team. A happy workforce and a smooth testing system form the backbone of a quality end product.
Reduced Business Costs
Multiple errors, multiple tests, reruns: each may seem trivial, but over time they accumulate and increase business costs. Automated test algorithms help reduce these costs, as they are designed to detect failures or design glitches in the earlier stages of desktop/software product development. This means you will require fewer product test reruns compared to manual testing.
Improved Accuracy
This is one of the major advantages you can leverage from a smart test automation setup. It largely eliminates human error, especially in a complex network. Even though the chance of computer-driven errors persists, the error rate is reduced to a great extent. The result is a level of accuracy that meets customer demands and keeps them happy.
Assurance of Stability
Automated testing helps you validate the stability of your product in the earliest phases of product development, well before its release. Manual stability tests often take a lot of time and can be hampered by human error. Automated testing lets you set up a format in which status updates for the product are recorded automatically in the relevant database.
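A hedged sketch of such an automated stability check is shown below; the operation power_cycle_and_ping is a hypothetical placeholder for one stability iteration, and results are logged to a local SQLite database so that status reports can be generated automatically.

import sqlite3
import time

def power_cycle_and_ping() -> bool:
    """Hypothetical placeholder for one stability iteration against the device."""
    return True

def run_stability_check(iterations: int = 100, db_path: str = "stability.db") -> None:
    # Repeatedly exercise the device and record every result with a timestamp.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS runs (ts REAL, passed INTEGER)")
    for _ in range(iterations):
        passed = power_cycle_and_ping()
        con.execute("INSERT INTO runs VALUES (?, ?)", (time.time(), int(passed)))
        con.commit()
    con.close()

if __name__ == "__main__":
    run_stability_check(iterations=10)

Any reporting tool can then query the runs table to produce the automated status updates mentioned above.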
Smart Test Automation: Challenges and Tips
With a complex smart testing process, automation comes with greater challenges. However, most of them can be resolved if you have the expertise and knowledge to implement the right strategies to get the best out of the smart test automation setup.
Some of the challenges are:
Lack of skilled professionals to handle technology-driven testing algorithms.
Not everyone has the skills to perform automated tests to their full potential. You can either hire skilled professionals or train your existing employees to adapt to the automated testing culture.
Lack of proper planning between the teams
One crucial aspect that decides the success of automated testing is good teamwork. Your teams need to work collaboratively to ensure stability in the tests. A modular approach can help here, where tests are built locally against a real device or browser; the teams can then run them regularly, map out the results, and coordinate better.
Dynamic nature of automated tests
This is quite common, as many companies have yet to build agility into their processes, which is required for the successful implementation of these tests. One way to overcome this is to start with small steps and scale your testing process later as the situation demands.
Undoubtedly, smart test automation is the future of desktop/software devices. For efficient implementation of automated test systems, we at Utthunga provide you with the right resources and the right guidance. Our experienced panel is well versed with the leading technologies and has the perfect knack to pick out the right strategies that would aid your growth.
Leverage our services to stride ahead in the Industrie 4.0 era!