What are the Issues with Fog Computing?

First, what is fog computing?
A downside of cloud computing is that all this computing over the network relies heavily on data transport. While broadband internet access has generally improved over the last decade, there are still challenges: uneven accessibility, peak-time congestion, lower speeds on 3G and 4G cellular networks, and occasions of limited internet availability, whether underground, off the grid or on an airplane.
This lack of consistent access leads to situations where data is being created at a rate that exceeds how fast the network can move it for analysis. It also raises concerns over the security of that data, a problem that grows as Internet of Things devices become more commonplace.
Physically, in a fog computing configuration this extra computing power sits closer to where the data is created, at a fog node, which is considered a crucial ingredient in a cloud-fog-thing network. The fog node, typically located in a smart router or gateway device, allows data to be processed on that device so that only the necessary data is transmitted onward to the cloud, decreasing the bandwidth used.
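The filter-then-forward pattern described above can be sketched in a few lines. This is only an illustrative sketch: the `Reading` record, the `filter_readings` helper and the threshold rule are invented for this example, not any particular product's API.

```python
# Minimal sketch of a fog node's filter-then-forward loop.
# All names (Reading, filter_readings) are illustrative.
from dataclasses import dataclass


@dataclass
class Reading:
    sensor_id: str
    value: float


def filter_readings(readings, threshold):
    """Keep only readings that deviate enough to be worth uplink bandwidth."""
    return [r for r in readings if abs(r.value) >= threshold]


readings = [Reading("t1", 0.2), Reading("t2", 5.0), Reading("t3", 7.5)]
to_forward = filter_readings(readings, threshold=1.0)
assert [r.sensor_id for r in to_forward] == ["t2", "t3"]
```

Only `to_forward` would be sent to the cloud; the below-threshold reading stays local, which is exactly how the fog node trims bandwidth.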
An example of a use case for fog computing is a smart electrical grid. Modern electrical grids are quite dynamic: they respond to increased electrical consumption and, to be economical, lower production when it is not needed. To run efficiently, a smart grid relies heavily on real-time data about electrical production and consumption. Fog computing is ideal for this, as in some cases the data is created in a remote location and it is better to process it there.
In other situations, the data comes not from an isolated sensor but from a group of sensors, such as the electrical meters of a neighborhood, and it is better to process and aggregate the data locally than to overload the network by transmitting the raw data in its entirety. Connected vehicles are another case: each vehicle has the potential to generate quite a bit of data just on speed and direction, as well as transmitting to other vehicles when it is braking, and how hard. Because the data comes from moving vehicles, it needs to be transmitted wirelessly.
A key component of sharing the limited mobile bandwidth is the processing of data at the level of the vehicle via a fog computing approach through an on-board vehicle processing unit.
However, the same feature makes it difficult to manage and accumulate data in large-scale networks such as IoT [ 24 ]. Cloudlet has four major attributes: it is entirely self-managing, possesses enough compute power, offers low end-to-end latency and builds on standard Cloud technology [ 25 ]. Cloudlet differs from Fog computing in that its application virtualization is not suitable for the Fog environment, consumes more resources and cannot work in offline mode, as indicated by [ 26 , 27 ].
A micro data centre [ 28 ] is a small, fully functional data centre containing multiple servers and capable of provisioning many virtual machines. Many technologies, including Fog computing, can benefit from micro data centres, as they reduce latency, enhance reliability, are relatively portable, have built-in security protocols, save bandwidth through compression and can accommodate many new services.
Fog computing can enable users to take full control and management of the network by providing Network Level Virtualization (NLV) and real-time data services. OpenPipe [ 29 ] utilises Fog computing to implement NLV through a hybrid model consisting of a virtual Software Defined Network (SDN) controller located in the Cloud, virtual local controllers located in the Fog, virtual radio resources for wireless communication and a virtual cloud server. The SDN controller is a global and intelligent module that manages the entire network.
Local controllers forward data to the SDN controller, which serves real-time and latency-sensitive applications by deciding, based on user policies, whether to process data on the local controller or the SDN controller. The benefits of the proposed system include load balancing, handover without compromising Quality of Service (QoS), low energy consumption, reduced latency and low network overhead. In addition, Fog nodes can compress and reorganize web objects for optimal speed.
In addition, various compelling research studies [ 30 - 32 ] have been presented for improving the performance of SDN and virtual machines by making use of cloudlets, which can perform dynamic VM synthesis and single-hop low-latency wireless access, and create VM overlays that load only the difference between the desired custom VM and its base VM.
These features have been implemented by Carnegie Mellon University in a project called Elijah, which is available in a GitHub repository [ 33 ]. The use of a highly virtualized environment results in a large number of shared-technology security issues. For example, an insecure hypervisor can be exploited to bring down the entire Fog platform, as it is a single point of failure and manages all the Virtual Machines [ 34 ].
The risks associated with shared technology are critical because it takes only a minor vulnerability or misconfiguration to damage all Fog services and user operations, and to give attackers access to exploit Fog resources. Researchers from Cisco are utilising Fog computing to increase the performance of websites [ 37 ].
Instead of making a round trip for every HTTP request for content, style sheets, redirections, scripts and images, Fog nodes can help fetch, combine and execute them at once. In addition, Fog nodes can distinguish users based on MAC addresses or cookies, track user requests, cache files and determine local network conditions. In another paper, Fog computing significantly reduced the response time of a Cloud-based temperature prediction system [ 38 ].
Due to Fog systems, the prediction latency was decreased from 5 to 1. Fog nodes are also able to manage caches. Another simple approach [ 41 ] would be to use Edge computing to generate user-specific pages by replicating the application code at multiple edge servers. The edge servers are capable of keeping numerous copies of data and performing both content-aware and content-blind data caching.
Using a Fog platform for optimising web services will also introduce web security issues. For example, if user input is not properly validated, the application becomes vulnerable to code injection attacks such as SQL injection, where SQL code supplied by the user is automatically executed, resulting in the potential for unauthorised data access and modification. Similarly, through insecure web APIs, attacks like session and cookie hijacking (posing as a legitimate user), insecure direct object references (for illegal data access), malicious redirections and drive-by attacks [ 43 ] could force a Fog platform to expose itself and the attached users.
Web attacks can also be used to target other applications in the same Fog platform by embedding malicious scripts (cross-site scripting) and potentially damage sensitive information. A potential mitigation mechanism is to secure the application code, patch vulnerabilities, conduct periodic auditing, harden the firewall by defining ingress and egress traffic rules and add anti-malware protection.
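To make the SQL injection risk concrete, here is a minimal sketch using Python's standard-library sqlite3 module, contrasting string interpolation with a parameterised query. The table, user and payload are invented purely for illustration.

```python
import sqlite3

# Hypothetical user table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # classic injection payload

# Vulnerable: string interpolation lets attacker-controlled SQL execute.
unsafe = f"SELECT role FROM users WHERE name = '{user_input}'"
rows_unsafe = conn.execute(unsafe).fetchall()

# Safe: the driver binds the value, so the payload is treated as data.
rows_safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

assert rows_unsafe == [("admin",)]  # injection widened the WHERE clause
assert rows_safe == []              # payload matched no user
```

The parameterised form is the input-validation discipline the mitigation paragraph above calls for: the driver never lets the payload change the query's structure.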
Mobile applications have become an integral part of modern life, and their intensive use has led to exponential growth in mobile data consumption, and hence the requirement for 5G mobile networks. Fog computing can not only provide a 5G network with better service quality, but can also help in predicting the future needs of mobile users [ 44 ].
Inherently, Fog nodes are distributed within the proximity of users, a characteristic that reduces latency and establishes adjacent localized connections. Broadly speaking, the diverse topological and mesh network connections among the mobile network, Fog nodes and the Cloud platform make Fog systems beneficial for 5G technology, NLV and SDN [ 45 ]. Fog computing is also able to handle the load balancing issues of a 5G network [ 46 ].
When many users simultaneously request computation in a large-scale network, creating small cells of Fog nodes based on the size of the requested task and system parameters can improve load balancing. Edge computing is also being used to reduce network latency, ensure highly efficient service delivery and offer an improved user experience by utilising the programmable nature of NLV and SDN [ 47 ]. Without properly securing the virtualised infrastructure of Fog nodes in a 5G network, providers risk not being able to achieve the desired performance.
A single compromised Fog node in the 5G mobile network can create a potential entry point for a Man-in-the-Middle (MITM) attack and interrupt all connected users, leak data, abuse the service by exceeding the limit of a data plan and damage sibling Fog nodes. A MITM attack can be launched by a malicious internal user and can exploit the Fog platform by sniffing, hijacking, injecting and filtering data incoming from the end-user [ 48 ].
This will consequently affect the data communication of the underlying network. The most common way of eliminating such issues is to encrypt communication with either symmetric or asymmetric algorithms, apply mutual authentication, use the OAuth2 protocol, isolate compromised nodes and employ certificate pinning, as discussed by [ 49 ].
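Certificate pinning, one of the mitigations listed above, can be sketched with Python's standard ssl module. This is an illustrative sketch, not a production implementation: the host, port and pinned fingerprint are assumed to be provisioned out of band at deployment time.

```python
import hashlib
import socket
import ssl


def cert_fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_cert).hexdigest()


def open_pinned_connection(host: str, port: int, pinned_fp: str) -> ssl.SSLSocket:
    """Connect over TLS, then refuse the session unless the server's
    certificate matches the fingerprint pinned at deployment time."""
    ctx = ssl.create_default_context()  # normal chain + hostname checks
    sock = ctx.wrap_socket(socket.create_connection((host, port)),
                           server_hostname=host)
    der = sock.getpeercert(binary_form=True)
    if cert_fingerprint(der) != pinned_fp:
        sock.close()
        raise ssl.SSLError("certificate fingerprint mismatch (possible MITM)")
    return sock
```

Pinning narrows trust from "any CA-signed certificate for this name" to "exactly this certificate", which is what defeats a MITM holding an otherwise valid rogue certificate.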
By deploying Smart Grids, large amounts of data are collected, processed and transmitted from smart meters using data aggregation units (DAU). A meter data management system (MDMS) uses the generated data to forecast future energy demands. According to [ 50 ], the data aggregation process takes a long time due to the low bandwidth capacity of the hardware, but can be improved with the help of Fog computing. First, a Fog-based router connected to the smart meters accumulates the readings of all sub-meters within a pre-defined time.
Secondly, all values are transmitted to a second Fog platform, which performs data reduction. This Fog-based approach was tested on general-purpose Cisco routers running IOx, which are able to distinguish between Fog and non-Fog network packets. The method creates an Advanced Metering Infrastructure (AMI) that can reduce the amount of communication data and overhead within the network, resulting in an improvement in response time.
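The accumulate-then-reduce step described above can be sketched as follows. The record format, field names and sample values are hypothetical, chosen only to show how per-window aggregation shrinks what is sent upstream.

```python
from collections import defaultdict


def aggregate_window(readings):
    """readings: (meter_id, kwh) tuples collected during one pre-defined
    window. Returns one summary record per meter instead of every sample."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for meter_id, kwh in readings:
        totals[meter_id] += kwh
        counts[meter_id] += 1
    return {m: {"total_kwh": totals[m], "samples": counts[m]} for m in totals}


# Three raw samples collapse into two records for the uplink.
window = [("m1", 1.0), ("m1", 2.5), ("m2", 4.0)]
summary = aggregate_window(window)
assert summary["m1"] == {"total_kwh": 3.5, "samples": 2}
assert summary["m2"] == {"total_kwh": 4.0, "samples": 1}
```

The bandwidth saving grows with the window length: however many samples each sub-meter produces per window, the uplink carries one record per meter.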
A similar architecture is created in [ 51 ] for AMI, where Fog computing helped in reducing latency, delay jitter and distance while improving location awareness and mobility support. Although sophisticated database software and high-capacity storage hardware are used for aggregation and processing, data can easily be replicated, shared, modified and deleted by any malicious intermediate or fake external node using a Sybil (identity-forging) attack, which can undermine the confidentiality, integrity and availability (CIA) of the data [ 52 ].
In addition, it is difficult for a Fog platform to centrally define, set and maintain access control attributes and user ownership over a large amount of moving data. Fog nodes continuously process, analyse and accumulate data to produce information, and it becomes difficult to retain data integrity and prevent data loss.
Fault tolerance is also very low, as the exact point of error is hard to identify in such a system. To eliminate these issues, security policies and strategies should be integrated into Fog systems to track energy consumption information, along with contingency plans and disaster recovery modules [ 53 , 54 ].
Fog computing is also applied in healthcare and elderly care systems, where self-powered wireless sensors transmit data to Fog nodes, as opposed to sending them directly to the Cloud. Using a large number of sensors, it is possible to create a smart healthcare infrastructure, where semantic tagging and classification of data are performed in the Fog layer, providing refined data to a Cloud system for further processing [ 55 ].
Another system uses a similar approach and integrates a Fog-computing-informed paradigm within a Cloud for medical devices, providing good Quality of Service (QoS) and governance [ 56 ]. With the help of Fog computing, healthcare systems provide services from a nearby location, store heterogeneous data, consist of smart low-power devices, and are able to switch among various communication protocols as well as facilitate distributed computing [ 57 ].
Another application of Fog computing in healthcare is Electrocardiogram (ECG) feature extraction to diagnose cardiac diseases [ 58 ]. This involves medical sensors transmitting data to a Fog layer that stores data in distributed databases, extracts ECG features and provides a graphical interface to display results in real time. The detection of a person having a stroke is of key importance, as the speed of medical intervention is life critical.
Both systems distribute computational tasks between Fog and Cloud platforms to provide an efficient and scalable solution, which is essential as it allows for the quick detection and notification of a patient fall. Patient health records contain sensitive data, and there are multiple points in any Fog platform where they can be compromised, such as by exploiting system and application vulnerabilities, unauthorised data access in storage or during transmission, malicious insider threats and sharing data with other systems [ 61 ].
Medical sensors continuously transmit data to Fog platforms through either wired or wireless connections. It is quite possible to compromise patient privacy, data integrity and system availability by exploiting the sensors and their underlying communication network. Wireless sensors usually operate in open, unattended and hostile environments.
This ease of access has the potential to increase the chances of attacks like DoS, report disruption and selective forwarding attacks [ 62 ]. In addition, if a Fog node manages sensitive data and lacks access control mechanisms, it might leak the data through account hijacking, unintended access and other vulnerable points of entry.
To avoid such issues, strict policies should be enforced to maintain a high level of control using multi-factor or mutual authentication, private networks and partial (selective) encryption. Fog computing can play an important role where efficient processing and instantaneous decision-making are required. Take the example of tracking multiple targets in a drone video stream, as stated in [ 63 ].
Instead of sending live video feeds to a Cloud-based application, the stream is directed towards the nearest Fog node. Any mobile device, such as a tablet, smartphone or laptop, can become a Fog node, run tracking algorithms and process raw video stream frames, thus removing the latency of transmitting data from the surveillance area to the Cloud. Surveillance video processing can also be performed using Edge computing, which shows potential in finding missing children [ 64 ].
Pushing the video feed of every camera sensor directly to the Cloud is not possible, but with the help of distributed edge servers and their processing power, each video can be processed individually and the Cloud system can gather the final results to yield a much faster output. A proximal algorithm [ 65 ] can also be implemented in the Fog nodes of a large-scale video streaming service to resolve the joint resource allocation problem.
A video data stream generated by camera sensors is sent to the respective Fog nodes, where it is stored and processed. The privacy of the stream must be maintained, as it contains audio and visual data transmitted to heterogeneous clients. Here, not only is the security of the Fog node important, but the network and all end-user devices involved in the transmission should also be considered, especially against APTs.
If a Fog platform or network contains bugs due to a lack of diligence, the crucial video stream might be viewed, altered or even destroyed. It is important that the Fog node ensures a secure connection between all communicating devices and protects multimedia content through obfuscation techniques, fine-grained access control, generating a new link for each video stream, selective encryption and limiting the number of connections [ 66 ].
An RSUC is a group of Fog devices that performs data forwarding operations. Fog nodes and other devices communicate in the form of policy rules and content. Other similar implementations have been proposed in [ 6 , 68 ], where Fog devices are either connected centrally with the SDN controller (SDNC) and the Cloud or interconnected with each other in a Machine-to-Machine manner.
To increase road safety, a Fog-based intelligent decision-support system for monitoring driving rule violations [ 69 ] has also been developed. The proposed system has three layers: lower, middle and upper. The lower layer detects hand-held device use while driving, along with the vehicle number, using camera sensors, and sends the information to the nearest Fog server. In the middle layer, the Fog server confirms whether the driver is intentionally violating the rules and communicates the vehicle identifier to the Cloud server.
Finally, in the upper layer, the Cloud server issues a traffic violation decision and alerts the relevant authorities. The security issues of Fog platforms in vehicular and road networks are similar to those associated with 5G mobile networks in terms of issues resulting from shared technology. Furthermore, vehicular networks do not have a fixed infrastructure, and due to the volume of connections, there are multiple routes between the same nodes. Such networks are exposed to potential DoS and data leak attacks due to a lack of centralized authority [ 70 ].
DoS attacks on a Fog platform, whether from end-users or external systems, can prevent legitimate service use as the network becomes saturated. In addition, all communication is wireless and hence susceptible to impersonation, message replay and message distortion [ 71 ]. Protection from these attacks is significant, as human life is involved.
The most common way of eliminating such issues is to implement strong authentication, encrypted communication and key management services, perform regular auditing, and deploy private networks and secure routing. Fog computing is also being used as a solution for food traceability management, where the aim is to remove poor-quality products from the supply chain using value-based processing [ 72 ]. A food item can be physically traced using various attributes, such as location, processing and transportation devices.
The quality of a food item is determined by distributed food traceability through a Cyber Physical System (CPS), which makes decisions based on fuzzy rules. Both food traceability and quality information are sent to the Fog network, where the entire food supply chain is traceable. At this point, the Fog network holds complete information about all tracked food items and subsequently transmits food quality information to the Cloud system, which can be viewed by stakeholders over the Internet.
Attackers could obstruct supply chain operations by exploiting the location and transportation processes of this system. If a Fog node is compromised, for example through account hijacking or exploitation of system and application vulnerabilities, the data can be falsified, which could ultimately result in the sale of substandard and low-quality food products. A network containing a large number of wireless sensors and Machine-to-Machine (M2M) communications raises a broad range of security concerns.
One such example is a resonance attack, where sensors are forced to operate at different frequencies and transmit incorrect data to a Fog node. This attack impacts the real-time availability of the network and data, along with the tolerance level [ 73 ]. Such systems should be protected by integrity checks, detection of deception attacks and redundancy to prevent a single point of failure.
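One concrete form of the integrity checks suggested above is to have each sensor authenticate its readings with an HMAC over a pre-shared key, so a Fog node can reject tampered or forged data. A minimal sketch with Python's standard hmac module; the key and payload format are invented for illustration.

```python
import hashlib
import hmac

SHARED_KEY = b"provisioned-at-enrolment"  # hypothetical per-sensor key


def sign(payload: bytes) -> bytes:
    """Tag a sensor reading so the Fog node can verify its origin."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()


def verify(payload: bytes, tag: bytes) -> bool:
    """Constant-time comparison avoids timing side channels."""
    return hmac.compare_digest(sign(payload), tag)


reading = b"freq=50.02;rms=229.9"
tag = sign(reading)
assert verify(reading, tag)
assert not verify(b"freq=49.00;rms=229.9", tag)  # tampered reading rejected
```

An attacker who alters a reading in flight cannot produce a matching tag without the key, so deceptive data is detected before it influences grid decisions.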
Instead of transmitting the entire audio stream, FIT extracts features like volume, short-time energy, zero-crossing rate and spectral centroid from speech and sends them to the Cloud for long-term analysis. The application was tested on six patients, and Fog computing made it possible to remotely process large amounts of audio data in a reduced duration. Another work extends the features of Mobile Edge Computing (MEC) into a novel programming model and framework [ 75 ], allowing mobile application developers to design flexible and scalable edge-based mobile applications.
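Two of the speech features named above, short-time energy and zero-crossing rate, are cheap enough to compute per frame on a Fog node. A self-contained sketch on a plain list of samples (the toy frame is illustrative):

```python
# Per-frame speech features: a Fog node sends these few numbers
# to the Cloud instead of the raw audio samples.
def short_time_energy(frame):
    """Mean squared amplitude of one analysis frame."""
    return sum(s * s for s in frame) / len(frame)


def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs that change sign."""
    crossings = sum(
        1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
    )
    return crossings / (len(frame) - 1)


frame = [0.0, 0.5, -0.5, 0.5, -0.5]       # toy 5-sample frame
assert zero_crossing_rate(frame) == 0.75   # 3 of 4 pairs cross zero
assert short_time_energy(frame) == 0.2     # (0 + 4 * 0.25) / 5
```

Reducing each frame to a handful of scalars like these is what lets the cited system ship hours of patient audio to the Cloud over a modest uplink.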
The developer can benefit from the presented work as the framework is capable of processing data before its transmission and considers geo-distributed data for latency-sensitive applications. Smartphones and tablets host a large number of applications, which can result in many complexities in terms of quality and security. Malware-based attacks can potentially corrupt and damage the CIA of data and communication.
A recent survey identified many potential security solutions, such as anti-virus software, firewalls, Intrusion Prevention Systems, constant data backups, software patching, frequently creating system restore points and performing behaviour analysis through dynamic monitoring [ 78 ]. A real-time brain state detection system has been implemented using a multi-tier Fog computing infrastructure [ 79 ]. The Fog platform is the data hub and signal processor that receives and processes data streams generated by an electroencephalogram (EEG) headset and motion sensors.
The Fog server extracts time-frequency characteristics from the signals and dispatches them to the brain state classifiers. The benefits of the proposed system are demonstrated through a multi-player online game called EEG Tractor Beam. A similar system is developed in [ 80 ], where a multi-tiered Fog and Cloud system, linked data and classification models are used for EEG-based Brain-Computer Interfaces (BCI).
The Fog servers are used for real-time data processing, caching, computation off-loading, managing heterogeneity and forwarding data from mobile devices and sensors to the Cloud system. Fog computing also has many potential applications in telehealth systems [ 81 ], which can perform quick mining and analysis of raw data streams gathered from different wearable sensors. Fog nodes compress data and are physically located nearby, helping to reduce bandwidth and power consumption.
Essentially, every Fog system should employ appropriate user access controls, data encryption and the Transport Layer Security (TLS) protocol [ 82 ] to secure data access, privacy and transmission. If any sensor device, Fog node or network, or even all of them, are compromised by an attacker due to a vulnerability or lack of diligence, the original data will be disclosed.
Currently, brain signals acquired by an EEG sensor are used to play games, which do not require high security. However, for future sensitive applications, it is vital to implement encryption algorithms such as Elliptic Curve Cryptography to protect against Advanced Persistent Threats (APTs) and data loss threats. Apart from enabling advanced technologies, Fog computing can perform many system-level tasks such as computation resource management, prediction, estimation and reservation.
It can also perform policy-based data filtration and pre-processing, and enhance security measures. A similar framework has been provided by [ 83 ] for IoT device resource management in micro data centres. It consists of six layers.
The framework also contains a resource estimation and pricing model for new IoT customers. Another article [ 84 ] suggests that Fog computing can enable dynamic real-time analysis, integrated security, reliability and fault tolerance. The Fog platform is highly flexible and scalable, as processing nodes (mobile devices) can frequently join and leave the network.
This property also allows support for more programming models and diverse system architectures to quickly manage substantial data. Another critical threat is that of the malicious insider, who can violate access control at the user-to-user, user-to-administrator, administrator-to-user and administrator-to-administrator levels.
As virtualized environments are loaded into memory, they can also be exploited through resource abuse (privilege escalation and escaping attacks), account hijacking (exploiting authentication protocols or social engineering) and DoS attacks caused by a large number of users requesting resources at the same time. Such attacks can result from inefficient and insufficient resource policies as well as a lack of user activity monitoring.
In this case, identity-based encryption algorithms [ 85 ] and the Role-Based Access Control model, as suggested by NIST [ 86 ], can be implemented to increase security. As Cloud operations require large amounts of continuous energy, different types of applications are investigated in [ 87 ] using Raspberry Pi based servers, which can be installed and configured as a Fog platform to reduce energy consumption.
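The Role-Based Access Control model mentioned above can be sketched minimally: permissions attach to roles, and users acquire permissions only through role assignment. The role names, permission names and users here are invented for illustration.

```python
# Minimal RBAC sketch: a request is allowed only if one of the user's
# roles carries the required permission. All names are illustrative.
ROLE_PERMISSIONS = {
    "admin":    {"read", "write", "configure"},
    "operator": {"read", "write"},
    "auditor":  {"read"},
}
USER_ROLES = {"alice": {"operator"}, "bob": {"auditor"}}


def is_allowed(user, permission):
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )


assert is_allowed("alice", "write")
assert not is_allowed("bob", "write")
assert not is_allowed("mallory", "read")  # unknown users get nothing
```

Because access decisions are mediated by roles rather than by individual grants, revoking a malicious insider means removing one role assignment instead of hunting down per-resource permissions.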
According to the results, the applications best suited to this approach are those that continuously produce static data within end-user premises and have a low connection rate. The authors also claim that energy consumption depends mostly on the amount of idle time and the number of downloads, updates and data pre-loads, whereas the actual content and the number of network hops among users have no vital impact.
They also designed a numerical model to show that Fog computing significantly improves the performance of cloud computing by trading off power consumption and delay in workload allocation. Similarly, to reduce energy consumption in mobile phones, researchers used a call graph to offload computation to edge servers while optimally managing and allocating communication resources [ 89 ]. This particular application encourages the use of Fog platforms for storing and processing specific user-defined kinds of private data locally in the Fog nodes, reducing communication cost and delay.
However, the presence of such private data puts the Fog platform in a sensitive position. As previously mentioned, there are many threats capable of compromising the CIA of data; for example, malicious insiders can read, alter and delete data. These issues can be addressed through encryption, authentication (uniquely validating and verifying each user), data classification based on sensitivity, monitoring and data masking [ 90 ].
Fog computing can aid human search and rescue operations conducted over a large geographical area in the event of a natural disaster [ 91 ]. Different Quality of Service (QoS) metrics, such as energy consumption, mobility, localization, optimal path calculation, data distribution among Fog devices and performance, are measured in the simulated post-disaster model to evaluate the system. Similar work suggests that VM-based Cloudlets [ 92 ] and tactical Cloudlets [ 93 ] can offer significant benefits in hostile environments.
Disaster recovery is a sensitive area in which Fog systems and connected devices are expected to work in extreme circumstances. In this case, the integrity and availability of the system are more important than confidentiality. Wireless security protocols can carry out checksums (to detect data errors), encrypt packets with minimal resources [ 94 ] and provision fine-grained access control to strictly validate users and terminate unwanted connections. Furthermore, emergency procedures and key management (to prevent the loss of decryption keys) should be considered, to retain availability and integrity without compromising the overall performance of the system.
Table 1 presents the relationship between the surveyed Fog application areas and the categories of security issues. Although the table has been populated based on interpreting the published literature, it should be noted that in some cases the authors may not have communicated specifics of their application which mitigate a potential security threat category.
The table identifies that none of the surveyed application areas have taken the necessary precautions to minimise the potential impact and risk of each category of security threat. Table 2 provides a summary of security controls in respect to each application area, highlighting the potential impact on Fog platforms with respect to the CIA model. The development of security measures in Fog systems is rapidly progressing, and some of the current publications do not contain sufficient detail to provide a thorough evaluation.
As a result, some of the identified knowledge gaps are speculative and based on the latest research activity. It is important to note that, due to the continuous increase in attack vectors, this is not an exhaustive list and some security issues may have been missed. With the advancement of Fog infrastructure development, new security issues will need to be identified and acknowledged. As determined in the above sections, the introduction of Fog platform functionality between end-users and Cloud systems creates a new point of vulnerability, which can potentially be exploited for malicious activities.
Unlike Cloud systems, there are no standard security certifications and measures defined for Fog computing. In addition, it can be stated that a Fog platform:

Has relatively smaller computing resources by its very nature, making it difficult to execute a full suite of security solutions able to detect and prevent sophisticated, targeted and distributed attacks;

Is an attractive target for cyber-criminals due to high volumes of data throughput and the likelihood of being able to acquire sensitive data from both Cloud and IoT devices; and

Is more accessible in comparison with Cloud systems, depending on the network configuration and physical location, which increases the probability of an attack occurring.

However, it has also been identified that in most cases the potential security measures that could be implemented to mitigate these threats are ignored.
A potential reason for this is that security in Fog systems is a nascent research area, and only a few solutions are available to detect and prevent malicious attacks on a Fog platform. The section below provides an overview of such systems. Research into preserving privacy in sensor-Fog networks [ 95 ] proposes summarised steps to secure sensor data between the end-user device and the Fog network, including fuzzing of data by inserting Gaussian noise into the data at a certain level of variance to lower the chance of eavesdropping and sniffing attacks.
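As a rough sketch of the fuzzing step, zero-mean Gaussian noise can be added to each reading before transmission; the variance value and sensor data below are illustrative assumptions, not taken from [ 95 ]:

```python
import random
import statistics

def fuzz(readings, variance=0.25):
    """Add zero-mean Gaussian noise to each sensor reading before
    transmission, reducing the value of any eavesdropped packet."""
    sigma = variance ** 0.5
    return [r + random.gauss(0.0, sigma) for r in readings]

readings = [21.3, 21.4, 21.2, 21.5, 21.3]   # e.g. temperature samples
noisy = fuzz(readings)
# Individual values are perturbed, but aggregates stay close:
print(statistics.mean(readings), statistics.mean(noisy))
```

The privacy gained depends on the noise variance; too little leaks the signal, too much destroys utility, which is exactly the trade-off the computational-overhead discussion below touches on.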
The system also includes a feature-reduction ability for minimising data communication with Fog nodes to help minimise risk. This work is of significance as it focusses on preserving personal and critical data during transmission. The proposed technique could be improved by selecting suitable encryption and key management algorithms, which play an important role in maintaining the privacy of data. In addition, there is little discussion of the computational overheads required to perform extensive data manipulation (fuzzing, segregation, encryption/decryption and ordering/re-ordering) before and after communication.
This could be of significance when designing and producing a Fog system, as the required computational overheads might not be available. Another important aspect to note here is that sensors transmit data continuously, possibly over long periods of time, and the proposed privacy framework might overload or even crash the underlying Fog system. One study [ 96 ] provides a solution for protecting data from malicious insiders using components of Fog and Cloud computing.
It combines behaviour profiling and decoy approaches to mitigate security threats. If any profile exhibits abnormal behaviour, such as an increase in access to different documents at unusual times, the system tags the access as suspicious and blocks the respective user. Decoy is a disinformation attack that includes fake documents, honeyfiles, honeypots and other kinds of baiting data that can be used to detect, confuse and catch the malicious insider.
This research domain is significant as it demonstrates potential alerting and mitigation methods to defend against data theft. However, the experiment is performed with a limited amount of data: eighteen students from a single university over a duration of four days. Hence, the accuracy results they claim might not be reproducible or universal. Their technique could be improved by increasing the population size and running the experiment over a longer timespan [ 97 ].
Furthermore, the computational requirements of such an approach are not mentioned. The paper provides no details on the quantity of data that is stored, nor the CPU time and memory required during analysis. Such behaviour profiling techniques are often performed in a traditional client-server architecture where computational resources are freely available. It is not clear how this technique can be executed on a Fog node without adverse effects on core functionality.
The technique can be further improved through critically analysing and selecting feasible machine learning techniques and the training data required for behaviour profiling. This carries more importance due to the presence of a large number of users and files.
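A minimal sketch of how such behaviour profiling might flag anomalous access patterns, assuming a simple per-user daily access count rather than the feature set used in the cited study:

```python
import statistics

def is_suspicious(history, today, threshold=3.0):
    """Flag a user whose document-access count today deviates from
    their historical profile by more than `threshold` standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0   # guard against zero variance
    return abs(today - mean) / stdev > threshold

history = [12, 15, 11, 14, 13, 12, 16]   # accesses per day, last week
print(is_suspicious(history, 14))   # False: a normal day
print(is_suspicious(history, 90))   # True: sudden spike in document access
```

A suspicious result would then trigger the decoy stage (serving honeyfiles) rather than an immediate block, which reduces the cost of false positives.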
Similar behaviour profiling and decoy techniques are used in other works [ 98 , 99 ] to detect and prevent malicious insider threats. The behaviour profiling, monitoring and user matching processes would not exert any burden on Cloud resources and would prevent actual data theft without exposing any sensitive data.
As an added benefit, all of these operations occur on-premise and execute relatively quickly due to low network latency. One piece of work introduces a preliminary policy management framework for Fog computing resources to enhance secure interaction, sharing and interoperability among user-requested resources [ ].
The system is divided into five major modules. The AA is responsible for defining the rules and policies stored in the PRep, while considering multiple tenants, applications, data sharing and communication services. When a service request is made by a user, it is sent to a PR that identifies the user based on a specific set of attributes and access privileges for the requested resource.
The user attributes and their respective permissions are stored in a database. Despite being in an initial phase, this policy framework has the potential to become an integral part of real-time distributed systems in the future, where there is a strong need for access, identity and resource management abilities. However, this framework is limited to those systems that are able to allocate dedicated resources within Fog platforms for the bulk of the computations required by the various modules to execute the framework.
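A toy sketch of the attribute-based lookup described above; the resource names, attribute keys and data layout are hypothetical assumptions, and only the PR/PRep roles follow the text:

```python
# Hypothetical policy repository (the PRep): each resource maps required
# attributes to the sets of values that are permitted.
POLICY_REPOSITORY = {
    "telemetry-db": {"role": {"analyst", "admin"}, "tenant": {"acme"}},
    "actuator-api": {"role": {"admin"}, "tenant": {"acme", "globex"}},
}

def resolve_request(user_attributes, resource):
    """PR step: grant access only if every attribute required by the
    policy matches one the requesting user presents."""
    policy = POLICY_REPOSITORY.get(resource)
    if policy is None:
        return False   # unknown resource: deny by default
    return all(user_attributes.get(attr) in allowed
               for attr, allowed in policy.items())

user = {"role": "analyst", "tenant": "acme"}
print(resolve_request(user, "telemetry-db"))   # True
print(resolve_request(user, "actuator-api"))   # False: role not permitted
```

Even this trivial lookup runs per request, which illustrates why the framework needs dedicated Fog resources and why the validation latency matters for time-sensitive applications.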
Fog platforms should be capable of handling highly time-sensitive applications, but the proposed validation process might take longer to make decisions. Another flaw in their technique is that the solution itself is inherently vulnerable to DoS attacks due to the complex authentication process in the PR and PDE. However, these security concerns can be reduced by building a performance model that collects values of memory, CPU and disk utilization and periodically compares them with estimated values [ ].
If the system identifies an anomaly, the user is redirected to the Shark Tank cluster, which is essentially a proxy that closely monitors the user while still granting full application capabilities. Insecure authentication protocols between Fog platforms and end-user devices have been identified as a main security concern of Fog computing by [ 19 ].
Results show that the attack did not cause any visible change in the memory and CPU consumption of the Fog node; hence it is quite difficult to detect and mitigate. The authors recommend that the risk of such attacks can be reduced by securing communication channels between the Fog platform and the user through authentication schemes. In their current state, Fog platforms are missing rigorous authentication and secure communication protocols tailored to their specification and requirements.
In a Fog platform both security and performance factors must be considered in conjunction, and mechanisms such as fully homomorphic encryption [ ] and the Fan-Vercauteren somewhat homomorphic scheme [ ] can be used to secure the data. These schemes consist of a hybrid of symmetric and public-key encryption algorithms, as well as other variants of attribute-based encryption. As homomorphic encryption permits normal operations without decrypting the data, the reduction in key distribution helps maintain the privacy of data.
Other research provides a similar framework to secure smart grids, independent of Fog computing, called the Efficient and Privacy-Preserving Aggregation (EPPA) scheme [ ]. The system performs data aggregation based on the homomorphic Paillier cryptosystem. As the homomorphic property makes it possible for local network gateways to perform operations on cipher-text without decryption, it reduces the authentication cost in terms of processing power while maintaining the secrecy of data.
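To illustrate the additive homomorphism that EPPA relies on, the following is a toy Paillier implementation with deliberately tiny primes; a real deployment would use 2048-bit moduli and a vetted library, so this is a sketch of the principle only:

```python
import math

# Toy Paillier keypair (illustration only -- the primes are far too small
# for any real security).
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # modular inverse (Python 3.8+)

def encrypt(m, r):
    """Encrypt message m with randomiser r (r coprime with n)."""
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Two meter readings encrypted at the edge...
c1, c2 = encrypt(41, r=17), encrypt(59, r=23)
# ...aggregated by the gateway WITHOUT decryption:
aggregate = (c1 * c2) % n2
print(decrypt(aggregate))   # 100 == 41 + 59
```

Multiplying ciphertexts adds the underlying plaintexts, so the gateway learns only the total consumption, never the individual readings.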
This paper [ ] concludes that AES is a suitable encryption algorithm for a Fog platform. According to the results, encryption time was nearly the same on both a smartphone and a laptop for small amounts of data (Kb, 5 Mb, and 10 Mb). Although AES encryption is universally accepted [ ] and is feasible for Fog computing due to its low hardware requirements and smaller computations, the experiment does not compare AES with any other available encryption algorithm. In addition, the size of the encryption key plays an important role in strengthening the encryption.
Furthermore, the experiment should also have compared the performance and efficiency of different key sizes (128, 192 or 256 bits). The work lacks evidence and justification, as only three sample files are used in the whole experiment. Such a small sample size might not provide deep insight into whether AES is a suitable algorithm for Fog networks and storage. Moreover, the Fog platform consists of heterogeneous devices with different specifications, and a single algorithm might not cover all possible scenarios.
Encryption is already an additional task for the Fog platform and consumes a large amount of resources. The selection of an encryption algorithm, whether symmetric, asymmetric or hybrid, should be performed in accordance with provider and infrastructure requirements. It is evident from the above sections that the recommended security solutions are individually not sufficient to protect the CIA of a Fog platform.
Hence, the current security state of Fog networks does not satisfy modern-day security requirements. Broadly speaking, the literature briefly provides solutions for data integrity, insider threat, resource access policy management, user authentication and encryption. Any of these stated threats can allow attackers to put the CIA of the Fog network and connected devices at risk.
One potential solution to these issues is to reuse well-established and proven security protocols from other similar technologies. The main challenge here is to adapt these security measures and apply them in accordance with the requirements of the Fog platform.
The existing security measures have gone through rigorous testing, and using them has the potential to ensure that any Fog system satisfies the necessary industrial security standards. In light of the above literature review, this section presents the security knowledge gaps that should be covered to build a reliable, applicable and trustworthy Fog platform.
Despite its large potential and number of applications, there is a lack of security solutions available for Fog system developers and designers. However, as Cloud computing and many similar technologies (albeit centralised systems) resemble the working mechanism of Fog computing, they can provide a deeper insight into the security threats and solutions.
Even though each Fog deployment has a different set of security requirements, applications and sensitivity, the following subsections provide comprehensive, efficient and applicable security solutions, which have been gathered and tested on various systems.
They can also be used as generic best-practice guidelines while developing Fog software, so that security is enabled from within the platform. Recommendation: 1 The data needs to be secured before (at rest in the source location), during (in motion through the network) and after (at rest in the destination location) communication among IoT devices, the Fog network and the Cloud platform.
Future challenge: 1 Added data security measures typically cause a significant reduction in the computational resources available for normal Fog-based operations [ ]. In addition, cipher-text can consume more disk space than the original text, further influencing the working mechanism of the application and database layers. Data encryption is a widely used mechanism to protect data confidentiality.
For data at rest, the AES algorithm with a sufficiently large key size, or obfuscation, can be used to ensure privacy, while the Secure Socket Layer (SSL) protocol can be used for establishing secure communication between a server and a client [ , ]. The important aspect here is to clearly distinguish between archival data and sensitive information. Encrypting archival data like public video streaming will reduce the performance of the Fog system and impact the performance of sibling applications.
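A minimal sketch of establishing a verified secure channel with the Python standard library; the host name is a placeholder, and modern TLS is configured in place of the deprecated SSL protocol versions:

```python
import ssl

# Build a context that verifies server certificates and refuses
# legacy protocol versions (standard library only).
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2
print(context.verify_mode == ssl.CERT_REQUIRED)   # True: certs are checked

# A Fog client socket would then be wrapped before any data is sent,
# e.g. (host name is a placeholder):
# import socket
# with socket.create_connection(("fog-gateway.example", 443)) as sock:
#     with context.wrap_socket(sock, server_hostname="fog-gateway.example") as tls:
#         tls.sendall(b"telemetry payload")
```

Certificate verification, not just encryption, is what prevents the man-in-the-middle attacks raised in the authentication discussion above.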
It is, therefore, essential for the designer of a Fog system to adequately assess the importance of the data and implement security measures where necessary. Recommendation: 2 Fog platforms that maintain cache management systems are prone to software cache-based side-channel attacks, such as the exposure of cryptographic keys, which may lead to the leaking of sensitive information.
Future challenge: 2 Prevention of cache-based attacks is either too expensive for practical implementation, or the solution only protects against a specific kind of attack. Research shows that cache interference is the most common type of attack, whose elimination requires both hardware and software modifications [ ]. Fog systems that are used to enhance the performance and power efficiency of other systems using advanced memory caching techniques can be probed via cache side-channel attacks [ ], resulting in the exposure of sensitive data within connected systems.
The cache holds data that is frequently used and could contain personal user information. For new cache designs, solutions like the Partition-Locked cache and Random Permutation cache [ ] are alternative low-level implementations of a security-centric memory cache that can better protect residing data and relieve the Fog network from cache-interference attacks. In addition, a mechanism to prevent modifications to smart meter data in advanced metering infrastructure would be to retain collected data on the Fog node for a specific duration of time before release.
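The hold-before-release idea for smart meter data can be sketched as a time-keyed buffer on the Fog node; the hold duration and the reading format are illustrative assumptions:

```python
import heapq
import time

class HoldingBuffer:
    """Retain readings on the Fog node for `hold_seconds` before release,
    in the spirit of the metering-infrastructure suggestion above (sketch)."""

    def __init__(self, hold_seconds):
        self.hold = hold_seconds
        self._heap = []   # min-heap of (release_time, reading)

    def ingest(self, reading, now=None):
        now = time.monotonic() if now is None else now
        heapq.heappush(self._heap, (now + self.hold, reading))

    def release(self, now=None):
        """Return every reading whose hold period has elapsed."""
        now = time.monotonic() if now is None else now
        out = []
        while self._heap and self._heap[0][0] <= now:
            out.append(heapq.heappop(self._heap)[1])
        return out

buf = HoldingBuffer(hold_seconds=300)
buf.ingest("meter-7: 4.2 kWh", now=0)
print(buf.release(now=100))   # []: still held on the Fog node
print(buf.release(now=400))   # ['meter-7: 4.2 kWh']: released after hold
```

The delay window gives integrity checks a chance to run before any reading leaves the node, at the cost of added reporting latency.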
Even though these security solutions are expensive and difficult to implement, Fog platform developers should consider them, as it is important not to rely on standard default implementations that may result in significant weaknesses. Recommendation: 3 Fog systems that are continuously handling private data should monitor their network traffic for malicious activity. Future challenge: 3 A Fog network is usually connected to a large number of small devices. The data generated by a single device may be small, but when the streams of multiple devices are combined, the overall amount of data becomes significantly challenging to handle [ ].
Hence, filtering each network packet necessitates increased processing and memory capacity. Each Fog platform should implement resource-efficient network monitoring mechanisms. These should be considered an integral part of every Fog system, so that malicious activity can be identified and terminated before any real damage occurs. The fundamental underlying process comprises scanning dynamic and large networks to mark suspicious and malicious network packets based on pre-defined rules and policies.
The network scanning process can be classified as static, dynamic or a combination of both. Scanning is typically achieved by combining firewalls, anti-virus software and Intrusion Detection and Prevention Systems [ — ]. For further improvement, network monitoring applications can operate in a distributed and intelligent manner. They can use Artificial Neural Networks (ANNs) and rule matching [ ] for threat detection, as a large number of heterogeneous IoT devices transmit and process data on multiple levels (hypervisor, operating system, and applications).
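A minimal sketch of the rule-matching step in such a scanner; the rule set and packet fields are illustrative assumptions, not taken from any cited system:

```python
# Pre-defined rules, as described above: each names a condition that
# marks a packet as suspicious. Field names are hypothetical.
RULES = [
    {"name": "blocked-port",
     "match": lambda p: p["dst_port"] in {23, 2323}},       # e.g. telnet
    {"name": "oversize-payload",
     "match": lambda p: p["size"] > 65535},
    {"name": "spoofed-source",
     "match": lambda p: p["src"].startswith("10.0.0.")
                        and not p["internal"]},              # private src from outside
]

def scan(packet):
    """Return the names of all rules a packet violates (empty list = clean)."""
    return [rule["name"] for rule in RULES if rule["match"](packet)]

pkt = {"src": "10.0.0.9", "dst_port": 23, "size": 512, "internal": False}
print(scan(pkt))   # ['blocked-port', 'spoofed-source']
```

In a distributed deployment each Fog node would evaluate such rules locally and forward only flagged packets for deeper (e.g. ANN-based) analysis, keeping per-node resource usage low.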
Furthermore, due to the localised nature of Fog devices, the implementation of Virtual Private Networks (VPNs) can also help isolate the network from external attacks. Recommendation: 4 Fog systems should protect themselves against both new and existing malware-based attacks, which can occur in the form of viruses, trojans, rootkits, spyware and worms, to avoid unwanted infection and serious damage.
Future challenge: 4 The ever-increasing complexity of malware attacks, the lack of modern threat detection, the possibility of more zero-day vulnerabilities, and the sparse nature of connected mobile devices present significant protection challenges. The Fog system also requires a lightweight, cross-storage host agent and a network-based detection service to fully defend against these threats [ ].
Most Fog systems are missing appropriate malware protection schemes, as these require dedicated and continuous allocation of network and computational resources, which might not be available in every Fog platform. As many Fog systems are also deployed on smart-phones and tablets, such as in BCI applications, they can become a source of malware infection [ ]. One suitable solution would be a physical malware detection device [ ], as it would use minimal Fog resources.
By increasing the Fog platform specifications, tools like BareCloud [ ] can be deployed, which can automatically detect evasive malware. Furthermore, machine learning techniques [ — ] can be applied to identify zero-day attacks with higher accuracy. SONM is a decentralized, open-source structure where users can buy and sell computing power, building a market profitable to both sides. SONM uses fog computing, unlike existing cloud-based computational systems (Amazon, Microsoft, Google, and others), which use a centralized model and rely on their own servers and computational power to achieve the same result.
Fog computing is defined in Wikipedia as: an architecture that uses one or more collaborative multitudes of end-user clients or near-user edge devices to carry out a substantial amount of storage (rather than stored primarily in cloud data centers), communication (rather than routed over the internet backbone), control, configuration, measurement and management (rather than controlled primarily by network gateways such as those in the LTE core network).
The internet backbone can be defined as the set of strategically interconnected computer networks and core routers of the Internet. While the bigger cloud-based computing companies use their own interconnected networks, SONM takes a different approach. SONM has prepared a comprehensive Business Overview and Whitepaper, and we will try to bring the main points to you here.
The links to these resources will be below in the Resources section, and we suggest reading them if you want more information before you decide to participate in the ICO, or if you want a deeper look into SONM. SONM, through its fog computing, allows for interaction between buyers, who need computational power, and workers, who have excess computing power.
With fog computing, there is no need to pre-pay for computational power; buyers can pay on an as-needed basis. Everyone in the world who uses the internet for business will have the option to use SONM tokens to solve their computing power issues.
Also, all internet users will be able to use SONM to receive passive income by renting out their computational resources. The company points out that, as mining has become saturated, the cost of mining on the blockchain continues to rise, and some miners can no longer mine profitably.
Therefore, the fog computing platform is a great opportunity for these miners to use their excess capacity and sell it to buyers.