Amid current cyber security issues, the most prominent debate concerns the interplay between society and cyber security. A sustained focus has fallen on the social factors that shape how security is perceived and developed across the cyber realm, in line with common topics such as data protection, IT security infrastructures, use of the web, and cybersecurity strategy across European nations and other regions of the world (Von Solms and Van Niekerk, 2013; Abomhara, 2015; Razzaq et al., 2013). The criticality of the situation surrounding cyber operations demands renewed attention, particularly to trust, agility, partnerships and cooperation. As Tweneboah-Koduah et al. (2017), Kritzinger and von Solms (2010) and Abomhara (2015) point out, while the internet, and more recently interconnectivity through the Internet of Things (IoT), is modelled as a public good, it still requires attention to the security, integrity and availability of cyberspace systems. In this context, the discussion focuses on three fundamental issues: the poor decisions people make, failure to act on time, and environmental and systemic uncertainty.
In the aftermath of the major cyberattacks and data breaches recently affecting organizations such as Sony Pictures, T-Mobile, Yahoo, JP Morgan and Target Corporation, concerns about cyber security and cyber threats have intensified (Jalali et al., 2019; Cimpanu, 2019; ThreatWarrior, 2019). Following a hack attributed to suspected Russian actors, the United States Democratic National Committee (DNC) noted how critical it is for organizations to remain vigilant and to act effectively in protecting their infrastructure against cyber incidents (Mcclatchy, 2017; Geller and Hendel, 2018; Shane, Sanger and Lipton, 2016). In 2016, Yahoo is believed to have had over 1 billion accounts compromised. In another incident, Target paid $18.5 million in 2017 to settle claims arising from a data breach whose total costs amounted to $292 million (Warren et al., 2018). However, it is never a matter of financial losses alone: companies must also endure irreparable damage, loss of credibility and a weakened competitive position. Attacks are anticipated to become more complex and sophisticated, with more subtle vulnerabilities seen to increase the risks (Pertermann, 2012). Against this backdrop of increased risk, the central problem is mapping the fundamental issues that surround cybersecurity and the complex structure around it.
First, organizations and their stakeholders are either making wrong decisions or failing to act on time. As argued by Jalali et al. (2019) and Gheyas and Abdallah (2016), cyber incidents cannot be predicted with certainty. In principle, the trade-off between expected returns and risks can be resolved by computing the risks and selecting appropriate risk-return combinations; rational decision makers would then invest heavily in the information security measures expected to yield positive returns within the set time (Jalali et al., 2019). For such decisions, it is critical to understand the drivers of information security risk and the probability of security incidents (Brown et al., 2015; Jalali et al., 2019). Uncertainty on these points makes it hard to allocate appropriate resources and poses further challenges in measuring the costs and benefits of information security investments. Essentially, the lack of historical data makes it difficult to develop sound risk management strategies, estimate the possible impact of a hypothetical cyber incident and conduct meaningful cost-benefit analyses. Another fundamental issue is failure to act on time as a result of delays and the complexity of systems. According to Sadeghi et al. (2015) and Wang et al. (2016), systems become more complicated owing to the web of interconnected components that introduces potential delays. As highlighted by Wan et al. (2017) and Sabaliauskaite and Mathur (2015), the manager's key problem is that problem solving tends to be event oriented and, at some point, reactionary. Applying event-oriented frameworks leads to a failure to comprehend the connections across components, and therefore to delays between cause and effect. Where there are delays between applying a decision and observing its effects, the result is instability: the system tends to oscillate, compelling managers to attempt to reduce the perceived time gap (Ben-Asher and Gonzalez, 2015). Such delays can be inherent, and they are largely accompanied by a lack of understanding of how to bring the system to equilibrium and how to make the most effective decisions for the system at a given time.
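To make this reasoning concrete, the following is a minimal sketch, in Python and with purely hypothetical figures, of the kind of cost-benefit calculation described above: an annualized loss expectancy is estimated as a single-loss expectancy multiplied by an annual rate of occurrence, and a control appears worthwhile only when the reduction in expected loss exceeds its cost. The function name and the dollar values are illustrative assumptions, not figures drawn from the literature cited here.

```python
# A minimal sketch (hypothetical figures) of the risk-return reasoning described
# above: annualized loss expectancy (ALE) is estimated as single-loss expectancy
# times annual rate of occurrence, and a control is worthwhile when the reduction
# in expected loss exceeds its annual cost.

def annualized_loss_expectancy(single_loss_expectancy: float,
                               annual_rate_of_occurrence: float) -> float:
    """Expected yearly loss from one incident category."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical estimates for a data-breach scenario.
ale_without_control = annualized_loss_expectancy(2_000_000, 0.30)  # no extra control
ale_with_control = annualized_loss_expectancy(2_000_000, 0.10)     # control lowers likelihood
control_annual_cost = 250_000

net_benefit = (ale_without_control - ale_with_control) - control_annual_cost
print(f"Expected annual benefit of the control: ${net_benefit:,.0f}")
# A positive value supports the investment; the difficulty discussed above is that
# the rate-of-occurrence inputs are rarely backed by reliable historical data.
```

The very inputs this calculation depends on, the probability and severity estimates, are precisely the quantities that the lack of historical data makes difficult to pin down.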
Other fundamental issues that cannot be delinked from incidents of data breaches and the rising number of cyberattacks are environmental and systemic uncertainty. Cavelty (2012) and Briggs (2008) noted that the global messaging network supporting financial transactions is believed to have suffered a chain of large-scale cyberattacks, and it could not be established what forced the entire network to disconnect and discontinue its services. This incident, among many others, has raised the subject of systemic uncertainty associated with cyber risk (Kopp et al., 2017). The International Monetary Fund took note of the fact that systemic risk is a term used almost universally, yet one that is hard to define and quantify (Bouveret, 2018). Systemic uncertainty is almost inherent in that its impact or consequences can be widespread: there can be an absolute breakdown of an entire system that is complex, with multiple variables, dependencies, connections and interdependencies producing cascading consequences.
Risk events tend to be sudden and unexpected, impeding the planning that would shape the scale of response. The systemic nature of such events undermines the availability, confidentiality and integrity of critical information, touching sensitive systems and leading to data breaches (Jalali et al., 2018). The changing environment equally raises doubts in the face of an unprecedented rate of evolution that reflects the fast pace of innovation and technology. This can be seen in the growing digitization of assets, networks, systems and data, which brings unprecedented efficiencies. In 2016, an estimated 3.4 billion people were online. This massive growth in internet connectivity may create increased opportunities, but it equally leads to an explosion of applications accompanied by endless risks.
The study of cyber maneuver rests on the primary definition of the application of force to capture, degrade, destroy, deny, manipulate and disrupt computing and information resources for the purpose of attaining a position of advantage over one's competitors (Applegate, 2012; Beraud et al., 2011; Vidalis and Angelopoulou, 2013). The force is aligned to special-purpose code, chiefly meant to accomplish the objective of the attacker or defender as implemented at a given time and within a virtual location. Based on this understanding, the discussion presents a critique of the article "Deception and Maneuver Warfare Utilizing Cloud Resources" authored by Vidalis and Angelopoulou (2013). The critique considers the credentials of the authors, the chosen methods, issues of generalizability and the state of evidence (which may be biased or attract a conflict of interest), relevance to other literature, and timeliness. It also examines how the cyber maneuver principles can underpin both offensive and defensive operations across the modern information environment.
Stilianos Vidalis is a lecturer at Staffordshire University. His experience and expertise are informed by a PhD in Threat Assessment within the context of Information Security, received in 2004. His research interests span information operations, information security, profiling, effective computer defense mechanisms, digital forensics and threat assessment. Olga Angelopoulou, for her part, is a lecturer and programme leader in charge of the MSc Computer Forensic Investigation at the University of Derby. Her experience and expertise are informed by a PhD in Computing from the University of Glamorgan, and her research interests include online fraud, digital forensics, online social networking and identity theft. Both Stilianos Vidalis and Olga Angelopoulou therefore have backgrounds directly relevant to the subject of cyber maneuver, which must have helped them to sample ideas, document them and publish their work in 2013, a period that touches on some of the current issues in cyber maneuver. They aimed at establishing how maneuver warfare operations work in the face of cloud computing resources, and how such operations can be used in an information security model for the 21st century.
On the side of methodological contributions, Vidalis and Angelopoulou (2013) did not make the effort to define the tools and processes considered for the research. It cannot be ignored, however, that the authors did consider the significance of analyzing and interpreting findings from other relevant sources. This can presumably be deemed a systematic literature review that selectively notes the significant areas that directly touch on the relations established in the research topic. The state of evidence is justifiable insofar as conclusions are drawn from the chain of sources the authors considered. In revisiting assets, threats and vulnerabilities, it is clear that Vidalis and Angelopoulou (2013) relied on sampled findings from Blyth and Kovacich (2006) and Stoneburner et al. (2002), sources that helped them establish the intelligence channels associated with or assigned to adversaries. However, there is a significant degree of bias and prejudice in the way evidence is extracted. The chain of sources informing the ideas and issues addressed in the article includes Vidalis and Kasmi (2007), Vidalis and Jones (2005b), Vidalis (2004) and Vidalis and Jones (2005a). All of these are contributions from one of the authors, Stilianos Vidalis, which highlights a high chance of a biased stand in the course of expressing arguments.
In essence, the many contributions from Stilianos Vidalis cited in the article suggest a degree of conflict of interest, evidenced by unbalanced contributions from either side. Despite this, the article remains on course, taking note of the insights on the interplay between deception and maneuver in warfare. The authors share perceptions of the modern information environment in the face of cloud services and information operations. It is also interesting to identify how the article presents the cyber maneuver principles and how they underpin both defensive and offensive operations in the modern information environment. In the context of offensive operations, the cyber maneuver principles emphasize characteristics such as being exploitative, positional and influencing. The interest of cyber maneuver in offensive operations is to seek and secure positional advantages with respect to the state of the competitor. On the exploitative side, cyber maneuver intends to capture information resources in order to gain tactical, operational and strategic competitive advantage. Vidalis and Angelopoulou (2013) equally take note of information operations while considering the IE domains. In this sense, cyber maneuver, based on the arguments presented by the authors, should aid targeted deception that reinforces the expectations of the adversaries while observing realistic timing and duration.
In defensive operations, cyber maneuver takes note of deceptive defense, counter-attack and moving-target defense. A closer look at deceptive defense leads to the cyberspace analogy of an ambush: the intent is to lure the attacker into actions that are likely to reveal the enemy's methodology, which then informs the necessary adjustments to the defense systems in light of the techniques, procedures and tactics used by the attacker. In equal measure, Vidalis and Angelopoulou (2013) make significant contributions on how deception can assure security in the information era, with particular attention to the low-intensity operations used in designing and developing a deception plan with the help of cloud resources.
In the pursuit of the required levels of computer security, attention is commonly directed at the security infrastructure. Plößl and Federrath (2008) and Dawoud et al. (2010) highlighted that weight is given to availability of the system, integrity and confidentiality at the same time, and that adverse effects on the system can be associated with the requirements placed on recovery time. Notably, as computer use grows with advancing technology, organizations are equally reminded of their role in addressing the different security postures that need to be enhanced (Aradau, 2010; Radvanovsky and McDougall, 2018). In the cybercrime landscape, as noted by Esteves et al. (2017) and Crawley (2016), most attackers are concerned with the type of the organization, the nature of its business data and its security system. Based on this preamble, the discussion focuses on the three components of the security infrastructure of a computer system, the strategic design principles, the IT security principles and the most relevant policies that can be applied to bolster computer security.
Most designers, while developing a system, focus on network security, cybersecurity and information security. First, information security covers both physical and digital data, which needs to be protected from unauthorized use, access, disclosure, inspection, modification or disruption (Muñoz-Arteaga et al., 2009; Fernandez-Buglioni, 2013). Therefore, as indicated by Stoneburner et al. (2001), developers are required to create an information security program that incorporates the most relevant governing structure; according to Peltier (2016) and Hu et al. (2014), the system has to bridge the gap between information security and the organizational goals and objectives. Unlike information security, cybersecurity is commonly incorporated in the security infrastructure to take care of digital data specifically (Von Solms and Van Niekerk, 2013). In essence, cybersecurity aims at defending the organization's computers, network and data from attacks or unauthorized digital access through the implementation of the needful technologies, processes and practices. Designers are commonly concerned with the IT infrastructure, striving to make it hard or impossible to tamper with, or gain unauthorized access to, the computer system and data. Within the scope of cybersecurity, based on the perspectives of Breda et al. (2017) and Hatfield (2018), more focus is given to social engineering, in which threat actors lure victims into giving up the most sensitive information; according to Mouton et al. (2016) and Mouton et al. (2014), common social engineering attacks include phishing, pretexting, baiting and quid pro quo. The last component of the security infrastructure that informs the necessary principles is network security (Bodeau and Graubart, 2017). This component aids the protection of data sent via devices connected to a common network; the necessary measures and principles ensure that information is neither intercepted nor changed before reaching the receiving end. The role of network security includes protecting the IT infrastructure from cyber threats, which include Trojan horses, worms, viruses, zero-day attacks, hacker attacks, spyware and denial-of-service attacks (Stallings et al., 2012). Based on these three components of security infrastructure, the discussion turns first to the strategic design principles.
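As a small illustration of the network security component just described, the sketch below shows, under the assumption of a pre-shared key between sender and receiver, how a keyed hash (HMAC) from Python's standard library can reveal whether a message was altered in transit. The key, message and function names are hypothetical placeholders; production systems would normally rely on an established protocol such as TLS rather than a hand-rolled check.

```python
# A minimal sketch of the integrity goal described above: the sender attaches a
# keyed hash (HMAC) to a message, and the receiver recomputes it to detect any
# modification in transit. Key and message are hypothetical placeholders.
import hashlib
import hmac

SHARED_KEY = b"example-shared-secret"  # hypothetical pre-shared key

def sign(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag for the message."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Return True only if the message was not altered after signing."""
    return hmac.compare_digest(sign(message), tag)

message = b"transfer 100 units to account 42"
tag = sign(message)

assert verify(message, tag)                                  # untampered message passes
assert not verify(b"transfer 900 units to account 7", tag)   # altered message fails
```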
The strategic design principles can be associated with the programmatic security plan and the investment made in the engineering strategy. As argued by Karvetski and Lambert (2012), this implies that the security plan should identify the relevant principles, which can then be tailored to a given system, program or system of systems (SoS). The first principle concerns the focus given to common critical assets, under which both programmatic and organizational resources can be applied in the most economical way that serves the greatest benefit. According to Merrell et al. (2010) and Karvetski and Lambert (2012), a critical focus on assets gives more attention to operational resilience, safety analysis, continuity of operations planning and contingency planning. Central analysis can be carried out on the program protection plan while taking note of Failure Modes, Effects and Criticality Analysis, Mission Threat Analysis and Functional Dependency Network Analysis (Garvey and Pinto, 2009; Merrell et al., 2010; Gullo, 2012). This principle ensures that shared data repositories, shared devices and common infrastructures are served with high priority while taking note of the cyber resiliency techniques. The second principle is to support agility and architect for adaptability (Salim, 2014). It is informed by the cyber resiliency goal of evolving, which recognizes both adaptability and agility as ways of responding to the technical and operational environment. Agility demands room for reallocation of resources, such that components within the security infrastructure can be repurposed and reused; Cheng et al. (2009) highlighted that adaptability demands that any design, model or architecture provide room for change as mission threads, the threat model and technologies evolve. Strategies that can be adopted to address this principle include controlled interfaces, modularity, externalization of roles, plug-and-play components and configuration of data (Stallings et al., 2012).
The third principle is reduction of the attack surface. Notably, according to Szefer et al. (2011) and Manadhata and Wing (2011), a large attack surface is hard to defend because it calls for more effort in analyzing, monitoring and responding to evident anomalies. When attack surfaces are reduced, adversaries are forced to concentrate their efforts on a reduced set of resources, environments and locations, which can more easily be defended and monitored. The fourth principle entails assuming that resources will be compromised (Rasouli et al., 2014; Whitman and Mattord, 2011). It is worth noting that system architectures routinely treat most resources as non-malicious; Rasouli et al. (2014) highlight that this assumption is common in IoT architectures and cyber-physical systems. However, systems whose components range from chips to major software modules are likely to be compromised for long periods without any detection. On this basis, assuming that some system resources are compromised is the more prudent security standpoint, and it implies that compromised resources should not be trusted. The call for subsequent system analysis is likely to reduce the consequences of a possible compromise in terms of the degree and duration of adversary-caused disruption. The last principle entails the expectation of evolving adversaries (Bodeau and Graubart, 2017). Notably, the most advanced cyber adversaries show a significant tendency to invest effort, time and intelligence in developing new TTPs or improving the prevailing ones, and most adversaries evolve in response to the opportunities offered by new technologies in the market. This implies that systems and missions are required to be resilient to the most unexpected attacks.
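As a rough illustration of the attack-surface principle, the sketch below assigns each externally reachable entry point a hypothetical damage-to-effort weight and sums the weights before and after unnecessary services are removed. This is only a simplified sketch in the spirit of attack-surface measurement; the services and weights are invented for illustration and do not reproduce Manadhata and Wing's (2011) formal metric.

```python
# A simplified, hypothetical sketch of the attack-surface reasoning above: each
# externally reachable entry point is weighted by how damaging and how easy to
# reach it is, and the total falls as unnecessary services are removed.

def attack_surface_score(entry_points: dict) -> float:
    """Sum of damage-potential / attacker-effort ratios over exposed entry points."""
    return sum(damage / effort for damage, effort in entry_points.values())

before = {
    "public web app": (8, 2),
    "legacy FTP service": (6, 1),
    "remote admin port": (9, 2),
    "marketing microsite": (3, 3),
}
# Harden by decommissioning the two services that are not needed.
after = {k: v for k, v in before.items()
         if k not in ("legacy FTP service", "remote admin port")}

print("score before hardening:", attack_surface_score(before))
print("score after hardening:", attack_surface_score(after))
# A lower score leaves defenders fewer, better-monitored resources to cover.
```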
Apart from the design principles, systems should equally adhere to the IT security principles. The first principle under the IT focus is the establishment of a sound security policy as the security foundation. The security policy matters most while designing the information system: it recognizes the security goals, procedures, standards and controls incorporated in the IT security architecture, and it is required to define the security-related roles and responsibilities and the perceived threats. The second principle demands that security be treated as an integral part of the entire system design (Salim, 2014). In most cases, it is costly and difficult to implement the necessary security measures after the entire system has been developed; it is therefore highly recommended that security requirements be incorporated into the system through the system life cycle process. The third principle calls for delineation of the physical and logical security boundaries, which are governed by the most relevant security policies. Boundaries are commonly defined by people, information technology and information, with additional factors including the nature of the security policies, among others. The fourth principle demands that risks be reduced to acceptable levels. Traditionally, risk avoidance was one of the security goals for the IT infrastructure (Stallings et al., 2012); now more attention is directed towards the cost-benefit analyses for each proposed control. The fifth principle holds that external systems are inherently insecure, which also agrees with the principle mentioned under the design requirements from the system perspective (Salim, 2014). The external domain is never under the system's control and should be regarded as insecure; system engineers, IT specialists and architects ought to be careful in designing security measures for external systems, which ought to be treated differently from the trusted internal system. The sixth principle requires one to identify the significant trade-offs between increased costs, reduced risks and a decline in operational effectiveness.
To meet the security requirements, security practitioners, system designers and architects are called upon to identify and address competing operational needs. Based on the principles, it is appropriate to modify and adjust security goals in light of the operational requirements; by addressing the trade-offs, policy and decision makers stand a chance of attaining better and more effective systems. The ultimate goal demands that the tailored security measures meet the organizational security goals. It is worth noting that IT security needs are rarely uniform, and it is commonly a fundamental issue to address security-related measures that carry negative impacts with them. At the same time, security teams and system designers need to evaluate trust levels before linking the system to an external network or even to internal sub-domains. Notably, the system equally demands protection of information while it is being processed, in storage or in transit.
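As a brief illustration of this trade-off analysis, the sketch below ranks hypothetical controls by the expected loss reduction they deliver per unit of cost, after discounting an assumed operational-friction cost. All names and figures are illustrative assumptions rather than real estimates.

```python
# A minimal, hypothetical sketch of the trade-off analysis named above: candidate
# controls are ranked by how much expected loss they remove per unit of cost,
# after discounting any loss of operational effectiveness.

controls = [
    # (name, annual cost, expected loss reduction, operational friction cost)
    ("multi-factor authentication",  50_000, 400_000, 20_000),
    ("full-disk encryption",         30_000, 150_000, 10_000),
    ("24/7 security operations",    400_000, 500_000,      0),
]

def benefit_cost_ratio(cost: float, loss_reduction: float, friction: float) -> float:
    """Net risk reduction delivered per unit of spend."""
    return (loss_reduction - friction) / cost

ranked = sorted(controls, key=lambda c: benefit_cost_ratio(*c[1:]), reverse=True)
for name, cost, reduction, friction in ranked:
    print(f"{name}: ratio = {benefit_cost_ratio(cost, reduction, friction):.2f}")
```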
The report on the policy framework was prepared in light of an understanding of UK contract law and the principles of public procurement. The two platforms are known for dictating the freedom to provide significant services to the government and the freedom of establishment. In the face of the UK Government contract that the company won, it is important to note that the organization is ready to observe mutual recognition, equality of treatment, transparency and the principle of proportionality. In the course of entering a contract with the UK Government, the report has established policies and standards deemed appropriate for the desirable IT infrastructure. To meet the necessary standards, it was most effective for the company to develop the four-pillar policy, which showcases the robustness of the system to be served. The four-pillar policy gives four significant and characteristic policies that define the IT infrastructure in terms of functional use and suitability, and these also constitute the rationale behind the development of this system. Part of the four-pillar policy is the information security policy, which safeguards the IT resources treated as critical assets. The attention directed at these resources considers their functional use and the need to introduce security controls that ensure continuity of service; at this point, the managerial infrastructure needs to take charge of the multiple points of access and of the administrative units and departments assigned critical responsibilities. The second component is the accountability policy, which defines the responsibilities shared across the communication and computing systems as far as the components of the infrastructural support are concerned. The components include the workstations, computers and servers, with more attention given to the networked services, where the assignment of privileges remains critical. The third component of the four-pillar policy is the information management policy. Attention is given to the information lifecycle and the levels of protection that need to be identified within the IT infrastructure, in light of the classification levels needed for different users of the system. This component pays more attention to security analytics, whereby the system maintains its functional use while still benefiting the users; in this case, security standards in line with the UK requirements were observed as far as the administrative role is concerned. The last component of the four-pillar policy is the access control policy, which narrows down to the availability and confidentiality of the IT resources. System checks on the data centers, wiring closets and servers are needed to comply with the judicial baselines and the access privileges defined by the system. Other areas that were deemed useful but not critical include the acceptable use policy and the remote access policy.
First, the acceptable use policy is relevant because it points to the integrity and the necessary security measures; however, it may largely benefit the user end more than the system itself, and most of these users need to be defined by the government itself. Secondly, the remote access policy may call for more infrastructural support, which may not be defined solely by the company but by all stakeholders deemed relevant by the government. Apart from the four-pillar policy, the company has equally paid attention to the high-priority controls that would prompt system checks from time to time: information security control, access control, and environmental and physical security control. A combination of these three controls provides a wholesome description of the robustness of the internal systems and the external layout that meets the functional need and objectives of the system.
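As a minimal illustration of how the access control component of the four-pillar policy might be enforced in software, the sketch below applies a default-deny, role-based check: users map to roles, roles map to permitted actions on resources, and any request not explicitly allowed is refused. The users, roles and resources are hypothetical placeholders rather than elements of the actual policy.

```python
# A minimal sketch of a role-based access control check: users are mapped to
# roles, roles to permitted (resource, action) pairs, and every request is
# denied unless explicitly allowed. All names are hypothetical placeholders.

ROLE_PERMISSIONS = {
    "administrator": {("server-room", "enter"),
                      ("data-center-logs", "read"),
                      ("data-center-logs", "write")},
    "auditor":       {("data-center-logs", "read")},
    "operator":      {("wiring-closet", "enter")},
}

USER_ROLES = {
    "alice": {"administrator"},
    "bob": {"auditor"},
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    """Default-deny: grant access only if some role of the user permits it."""
    return any((resource, action) in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

assert is_allowed("alice", "server-room", "enter")
assert not is_allowed("bob", "data-center-logs", "write")   # auditors cannot modify logs
assert not is_allowed("carol", "wiring-closet", "enter")    # unknown users are denied
```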
Abomhara, M., 2015. Cyber security and the internet of things: vulnerabilities, threats, intruders and attacks. Journal of Cyber Security and Mobility, 4(1), pp.65-88.
Bouveret, A., 2018. Cyber risk for the financial sector: a framework for quantitative assessment. International Monetary Fund.
Brown, S., Gommers, J. and Serrano, O., 2015, October. From cyber security information sharing to threat management. In Proceedings of the 2nd ACM workshop on information sharing and collaborative security (pp. 43-49). ACM.
Cavelty, M.D., 2012, June. The militarisation of cyberspace: Why less may be better. In 2012 4th International Conference on Cyber Conflict (CYCON 2012) (pp. 1-13). IEEE.
Ding, D., Han, Q.L., Xiang, Y., Ge, X. and Zhang, X.M., 2018. A survey on security control and attack detection for industrial cyber-physical systems. Neurocomputing, 275, pp.1674-1683.
Gheyas, I.A. and Abdallah, A.E., 2016. Detection and prediction of insider threats to cyber security: a systematic literature review and meta-analysis. Big Data Analytics, 1(1), p.6.
Jalali, M.S. and Kaiser, J.P., 2018. Cybersecurity in hospitals: a systematic, organizational perspective. Journal of medical Internet research, 20(5), p.e10059.
Jalali, M.S., Siegel, M. and Madnick, S., 2019. Decision-making and biases in cybersecurity capability development: Evidence from a simulation game experiment. The Journal of Strategic Information Systems, 28(1), pp.66-82.
Kopp, E., Kaffenberger, L. and Jenkinson, N., 2017. Cyber risk, market failures, and financial stability. International Monetary Fund.
Kritzinger, E. and von Solms, S.H., 2010. Cyber security for home users: A new way of protection through awareness enforcement. Computers & Security, 29(8), pp.840-847.
Pertermann, K. ed., 2012. Challenges in cybersecurity: risks, strategies, and confidence-building; international conference. Institut für Friedensforschung und Sicherheitspolitik an der Universität Hamburg.
Razzaq, A., Hur, A., Ahmad, H.F. and Masood, M., 2013, March. Cyber security: Threats, reasons, challenges, methodologies and state of the art solutions for industrial applications. In 2013 IEEE Eleventh International Symposium on Autonomous Decentralized Systems (ISADS) (pp. 1-6). IEEE.
Sadeghi, A.R., Wachsmann, C. and Waidner, M., 2015, June. Security and privacy challenges in industrial internet of things. In 2015 52nd ACM/EDAC/IEEE Design Automation Conference (DAC) (pp. 1-6). IEEE.
Sajid, A., Abbas, H. and Saleem, K., 2016. Cloud-assisted IoT-based SCADA systems security: A review of the state of the art and future challenges. IEEE Access, 4, pp.1375-1384.
Tweneboah-Koduah, S., Skouby, K.E. and Tadayoni, R., 2017. Cyber security threats to IoT applications and service domains. Wireless Personal Communications, 95(1), pp.169-185.
Von Solms, R. and Van Niekerk, J., 2013. From information security to cyber security. computers & security, 38, pp.97-102.
Wan, Y., Cao, J., Chen, G. and Huang, W., 2017. Distributed observer-based cyber-security control of complex dynamical networks. IEEE Transactions on Circuits and Systems I: Regular Papers, 64(11), pp.2966-2975.
Wang, D., Wang, Z., Shen, B., Alsaadi, F.E. and Hayat, T., 2016. Recent advances on filtering and control for cyber-physical systems under security and resource constraints. Journal of the Franklin Institute, 353(11), pp.2451-2466.
Warren, P., Kaivanto, K. and Prince, D., 2018. Could a cyber attack cause a systemic impact in the financial sector?. Bank of England Quarterly Bulletin, p.Q4.
Beraud, P., Cruz, A., Hassell, S. and Meadows, S., 2011, November. Using cyber maneuver to improve network resiliency. In 2011-MILCOM 2011 Military Communications Conference (pp. 1121-1126). IEEE.
Blyth, A. and Kovacich, G.L., 2006. Information assurance: security in the information environment. 2nd ed. London, England: Springer.
Chen, J.Q., 2014. A framework for cybersecurity strategy formation. International Journal of Cyber Warfare and Terrorism (IJCWT), 4(3), pp.1-10.
Mittal, S., 2015. Perspectives in cyber security, the future of cyber malware. Indian Journal of Criminology, 41(1), pp.210-227.
Vidalis, S. and Angelopoulou, O., 2013. Deception and maneuver warfare utilizing cloud resources. Information Security Journal: A Global Perspective, 22(4), pp.151-158.
Vidalis, S. and Jones, A., 2005b. Threat agents: What InfoSec officers need to know. The Mediterranean Journal of Computers and Networks, 1(2), 97–110.
Vidalis, S., 2004. Critical discussion of risk and threat analysis methods and methodologies. Report Number: CS-04-03. Wales, UK: University of Glamorgan.
Plößl, K. and Federrath, H., 2008. A privacy aware and efficient security infrastructure for vehicular ad hoc networks. Computer Standards & Interfaces, 30(6), pp.390-397.
Huang, K., Siegel, M., Madnick, S., Li, X. and Feng, Z., 2016, December. Diversity or concentration? Hackers’ strategy for working across multiple bug bounty programs. In Proceedings of the IEEE Symposium on Security and Privacy (Vol. 2).
Muñoz-Arteaga, J., González, R.M., Martin, M.V., Vanderdonckt, J. and Álvarez-Rodríguez, F., 2009. A methodology for designing information security feedback based on User Interface Patterns. Advances in Engineering Software, 40(12), pp.1231-1241.
Hu, Q., Dinev, T., Hart, P. and Cooke, D., 2012. Managing employee compliance with information security policies: The critical role of top management and organizational culture. Decision Sciences, 43(4), pp.615-660.
Peltier, T.R., 2016. Information Security Policies, Procedures, and Standards: guidelines for effective information security management. Auerbach Publications.
Mouton, F., Malan, M.M., Leenen, L. and Venter, H.S., 2014, August. Social engineering attack framework. In 2014 Information Security for South Africa (pp. 1-9). IEEE.
Karvetski, C.W. and Lambert, J.H., 2012. Evaluating deep uncertainties in strategic priority‐setting with an application to facility energy investments. Systems Engineering, 15(4), pp.483-493.
Cheng, B.H., de Lemos, R., Giese, H., Inverardi, P., Magee, J., Andersson, J., Becker, B., Bencomo, N., Brun, Y., Cukic, B. and Serugendo, G.D.M., 2009. Software engineering for self-adaptive systems: A research roadmap. In Software engineering for self-adaptive systems (pp. 1-26). Springer, Berlin, Heidelberg.
Whitman, M.E. and Mattord, H.J., 2011. Principles of information security. Cengage Learning.
Stoneburner, G., Hayden, C. and Feringa, A., 2001. Engineering principles for information technology security (a baseline for achieving security). McLean, VA: Booz Allen Hamilton Inc.
Bodeau, D.J. and Graubart, R.D., 2017. Cyber resiliency design principles: selective use throughout the lifecycle and in conjunction with related disciplines. McLean, VA: The MITRE Corporation, pp.2017-0103.
Garvey, P.R. and Pinto, C.A., 2009, June. Introduction to functional dependency network analysis. In The MITRE Corporation and Old Dominion, Second International Symposium on Engineering Systems, MIT, Cambridge, Massachusetts (Vol. 5).
Dawoud, W., Takouna, I. and Meinel, C., 2010, March. Infrastructure as a service security: Challenges and solutions. In 2010 the 7th International Conference on Informatics and Systems (INFOS) (pp. 1-8). IEEE.
Merrell, S.A., Moore, A.P. and Stevens, J.F., 2010, November. Goal-based assessment for the cybersecurity of critical infrastructure. In 2010 IEEE International Conference on Technologies for Homeland Security (HST) (pp. 84-88). IEEE.
Szefer, J., Keller, E., Lee, R.B. and Rexford, J., 2011, October. Eliminating the hypervisor attack surface for a more secure cloud. In Proceedings of the 18th ACM conference on Computer and communications security (pp. 401-412). ACM.
Rasouli, M., Miehling, E. and Teneketzis, D., 2014, November. A supervisory control approach to dynamic cyber-security. In International Conference on Decision and Game Theory for Security (pp. 99-117). Springer, Cham.
Breda, F., Barbosa, H. and Morais, T., 2017, March. Social engineering and cyber security. In Proceedings of the International Conference on Technology, Education and Development, Valencia, Spain (pp. 6-8).