Can someone help me with this paper?
Assignment: Tech review 2.docx
Reference: Tech Review Part 1: Data Analytics.docx

Technology Review #2: Emerging Application of Technology in a Critical Infrastructure
This paper is the second of two Technology Reviews that you will research and write for this course. For
this paper, you must choose an emerging application of technology which is suitable for future use in
systems, hardware, or software which are used to operate or support a critical infrastructure.
The technology review papers will be used to prepare your technology selection paper for the Analysis
of Alternatives exercise later in the course. Your audience for these papers is a group of senior
executives who will be meeting to decide which emerging technologies or emerging applications of
infrastructure technologies should be selected for one or more security-focused, Internal Research &
Development projects during the next budget cycle. Each of these executives is responsible for a
business area that will (a) develop systems and services which incorporate the emerging technologies or
(b) depend upon such systems and services to support their organization’s operation of portions of the
identified critical infrastructure.
For this paper, you will perform the first three stages of the technology scan:
Technology scanning is an evaluation model that is used when you need to develop a list of
candidate technology solutions. A technology scan can also be used when you need to obtain
information about the latest advancements in security products and technologies.
The scoping phase of this technology scan has already been performed. For this paper, your scope is late stage 1 and stages 2 through 5 (as shown in the figure below).
Image Source: http://www.atp.nist.gov/eao/gcr02-841/chapt2.htm
Your scope is further restricted to technologies which are used in the computers, digital devices, and
other electronic / electrical technologies (this includes networks and network infrastructures) which will
be deployed or used in a critical infrastructure. For definitions of critical infrastructures, see
To begin, select a technology which is in the basic research or proof of concept / invention stage (stage 1 and early stage 2 in the diagram above) and which meets the scoping restriction. You may use news articles, press releases, and government or company Web sites to help you find an appropriate technology. (Remember to cite these sources in your paper.)
Suggested technologies include:
Autonomous Vehicles (ground, sea, or air): Transportation Systems Sector
Crypto Currencies: Financial Services Sector (DO NOT CHOOSE Bitcoin)
Deep Space Communication Networks: Communications Sector
Implantable Medical Devices: Healthcare and Public Health Sector
Precision Agriculture (integrated systems using satellite imagery, GPS, Sensors, Robots):
Food & Agriculture Sector
Robot inspectors for physical infrastructures (buildings, roads, railways, pipelines, etc.):
Smart Grid (also called Advanced Metering Infrastructure): Energy Sector (DO NOT CHOOSE
Wearable Sensors for Hazardous Materials Detection (e.g. CBRNE): Emergency Services Sector
You are encouraged to look for and investigate additional appropriate technologies before deciding
upon your technology choice for this assignment.
If you decide to research a technology that is not on the suggested technologies list (see above), you
must first request and receive your instructor’s permission. Your instructor may require that you do
preliminary library searches for research papers and technical papers to prove that you can find a
sufficient number of resources to complete the assignment.
Survey of the Professional Literature
During your survey of the professional literature, you will identify 10 research papers or technical papers
which provide technical information about your selected technology (see selection requirements for
each paper). These papers must be dated 2011, 2012, 2013, 2014, or 2015 (five year window).
Allowable sources for research papers / technical papers are: (a) professional journals, (b) conference
proceedings, (c) dissertations or theses, and (d) technical magazines (published by either the ACM or
IEEE). If an article from one of the above sources does not have a reference list containing at least 3 references, you may use it in your review paper, but it WILL NOT COUNT towards the “10 research or technical papers” requirement.
The requirement to “survey the professional literature” must be met by using research
papers/publications and technical papers which are available from the following UMUC online library
databases: ACM Digital Library, Dissertations & Theses (Pro Quest), IEEE Computer Society Digital
Library, and Science Direct.
In this stage, you will evaluate and report upon the information found during your survey of the
professional literature. Read the abstract, introduction section, and closing sections for each of the
sources identified in your survey of the professional literature. From this information, develop a
summary of the technology that includes: (a) a description of technology and (b) planned uses of the
technology (products, services, etc.). IMPORTANT: your technology review must incorporate information
from each of your 10 “papers” from the professional literature.
Note: You may use other sources in addition to the papers which comprise your survey of the professional literature.
Next, brainstorm the security implications of this technology (if these are not specifically discussed by
your sources). You should consider use of the technology to improve cybersecurity and uses which will
negatively impact the security posture of an organization or the security of individual consumers or
users of the product. It is very important that you consider BOTH SIDES OF THIS ISSUE.
Note: Remember that the security posture of a system or product is framed in terms of risk,
threats, vulnerabilities, etc. Improvements to the security posture (positive security
implications) will result in reduced risk, increased resistance to threats or attacks, and decreased
vulnerability. Negative impacts on the security posture will result in increased risk, decreased
resistance to threats / attacks, and increased vulnerability (weakness).
Write down your thoughts and ideas about the security implications of this technology using (a) the Five
Pillars of Information Assurance and/or (b) the Five Pillars of Information Security. For your paper, you
do not need to include all ten “pillars” but you should address a minimum of three. If you are targeting
an “A” on your paper, address at least five of the pillars. (See Technology Review #1 Detailed Project
Description for definitions of the pillars and references.)
You should provide specific examples using characteristics and/or applications of the technology, e.g. an
emerging nano technology which can be used to tag “genuine” parts with an identification code. Such a
technology may be chosen to replace etched serial numbers or bar codes because the new tags will
decrease the probability that substitute or non-genuine parts will enter the supply chain undetected.
This will, in turn, decrease the risk of substitution which then decreases the probability of loss of
availability caused by non-genuine parts. Decreasing the probability of a negative event will decrease
the risk associated with that event.
WRITING YOUR EVALUATION
Your paper must provide the reader with an overview of the technology, followed by information about the potential security risks and/or benefits of its use (the security posture). You MUST use information paraphrased from the papers found during your Survey of the Professional Literature (with appropriate citations).
Your Technology Review papers should be at least three pages in length but no more than five pages
(excluding the title page and references page).
Your papers must comply with the formatting guidance provided by your instructor. All papers in this
course must also comply with APA Style for references and citations.
You are expected to write grammatically correct English in every assignment that you submit for
grading. Do not turn in any work without (a) using spell check, (b) using grammar check, (c) verifying that
your punctuation is correct and (d) reviewing your work for correct word usage and correctly structured
sentences and paragraphs. Together, these items constitute the professionalism category in the
assignment grading rubrics (worth 20% of the assignment’s grade).
Grading: Consult the grading rubric for specific content and formatting requirements for this assignment.
APA Formatting: See the resources posted under Content > Course Documents > APA Resources.
Running head: Data Analytics
Big data represents an opportunity that market-leading organizations will use to drive competitive advantage. Seventy-nine percent of business decision-makers believe that big data will boost revenue. The value of big data lies in the insights that organizations can draw from it, rather than in the data itself. There are practical obstacles to success. In particular, there will be a battle for talent, driven by the shortage of people with the technical skills to generate insights. This requires people who can identify the right business questions that analytics can answer. While big data offers opportunities, there are also real risks, including the legal and regulatory hazards surrounding issues such as data privacy, which can result in a breach of trust. Seventy percent of consumers say that they are persistently reluctant to let organizations share their personal data, while 49% say that they will be less willing to share personal data over the next five years. To secure top-line value from big data, enterprises need an all-encompassing, strategic plan for identifying the opportunities, overcoming the obstacles, and managing the risks.
People associate big data analytics with market segmentation and advertising targeting; however, you can also use it to turbocharge your software development process. With traditional development models, development cycles and the corresponding test cycles were quite long. Initially, manual testing generated quantities of data that could be managed with relational databases. But as agile methods continued to penetrate every part of software development, cycle times shortened, with everything tending towards the continuous: integration, testing, delivery, and so on. These continuous processes generate data that, according to Gartner, meets the definition of big data. The majority of it is “dark data” that never gets analyzed or put to use. For instance, data analysis may help you strike the balance between using costly, thorough testing to lower risk through better code coverage and using a more focused, economical testing process for agile application delivery.
There are several sources of data you can gather as an application moves from design to production. Both the requirements and the design can change at various phases of the application lifecycle, so the requirements, the design, and the changes to each of them constitute sources of data that you ought to capture. You can use this data to associate a design with the effort required to code it. That is more data for your analysis. You can use that data to relate coding effort to the number of defects. Every time the cycle repeats, the development team produces new data. This is the point where organizations usually apply conventional big data analytics to analyze customer behavior; however, you can use all the data created by designers, developers, and QA teams before the application is ever released to drive your own decision making. The data is there for the taking. Too often, however, that data is lost to the dark recesses of a database with no analysis.
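As a minimal sketch of the kind of analysis described above, the snippet below correlates per-module coding effort with later defect counts. The figures and module granularity are hypothetical, purely for illustration; in practice these numbers would come from your issue tracker and time-tracking data.

```python
from statistics import mean

# Hypothetical lifecycle data: hours of coding effort per module and the
# number of defects later reported against each module.
effort = [12.0, 40.0, 8.0, 55.0, 20.0]
defects = [2, 9, 1, 14, 4]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# A strong positive value suggests effort-heavy modules attract defects,
# which can guide where to focus review and testing.
print(f"effort-defect correlation: {pearson(effort, defects):.2f}")
```

A real pipeline would pull these series from version control and defect databases, but the correlation step itself is this simple.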
To use dark data to find the right balance between comprehensive and agile testing, and to improve application performance, one has to analyze performance data across multiple builds to see how it correlates with different areas of the code. This would involve moving the testing of those areas from a comprehensive regression suite run nightly to continuous tests run on every code commit. In this way, we could identify and address performance bottlenecks sooner in the development process and achieve our goal of shortening the overall development cycle. Big data analysis is a disruptive opportunity you can use to rethink how you work across the application lifecycle, and it can be applied to every phase of the software delivery pipeline. Everyone involved in the process produces data you can use, whether it is developers fixing bugs or QA engineers reporting them; you can use this data to help you make better business decisions. Later on, smart systems might even operate autonomously, learning from historical decisions and proposing improvements based on historical data. One could envision a continuous integration server that correlates tests with the code, running only the tests relevant to the code that has changed.
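The test-selection idea in that last sentence can be sketched as a coverage map from tests to the source files they exercise. The test names, file names, and coverage map below are hypothetical; a real CI server would build the map from instrumented coverage runs.

```python
# Hypothetical coverage map: which source files each test exercises.
TEST_COVERAGE = {
    "test_login": {"auth.py", "session.py"},
    "test_billing": {"billing.py", "invoice.py"},
    "test_report": {"report.py", "billing.py"},
}

def select_tests(changed_files, coverage=TEST_COVERAGE):
    """Return only the tests whose covered files overlap the change set."""
    changed = set(changed_files)
    return sorted(t for t, files in coverage.items() if files & changed)

# A commit touching billing.py triggers only the two billing-related tests.
print(select_tests(["billing.py"]))  # ['test_billing', 'test_report']
```

The design choice here is to err on the side of running a test whenever any file it touches has changed; a commit with no mapped files would select nothing, so a fallback to the full suite would be prudent in practice.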
You can change everything by applying big data analysis to the abundance of dark data in the development life cycle. The way development teams have worked for the last 20 years fundamentally changes when you combine the power of machine learning with the goldmine of data at your fingertips.
Whether you look at build logs, analyze source code, evaluate defect histories, or assess vulnerability patterns, you can open a new chapter in application delivery powered by big data analytics. Eventually, engineers will check in code, and with no manual intervention the system will immediately execute only the relevant tests. Code changes will be automatically flagged as high risk based on their history and the business impact of the changes, in the process triggering additional regression tests for execution. This is a step into the future, where machine learning and big data analytics help you build and deliver software faster and better than you ever could before.
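A simple form of the risk-flagging described above can be sketched from defect histories alone, before any machine learning is involved. The file names, defect counts, and threshold below are invented for illustration.

```python
# Hypothetical defect history: defects previously traced to each file.
DEFECT_HISTORY = {"payment.py": 11, "utils.py": 1, "parser.py": 6}

def risk_level(path, history=DEFECT_HISTORY, high_water=5):
    """Flag a changed file as high risk if its past defect count
    exceeds a threshold; unknown files default to low risk."""
    return "high" if history.get(path, 0) > high_water else "low"

for changed in ["payment.py", "utils.py"]:
    print(changed, risk_level(changed))
```

A fuller system would weight recency and business impact rather than a raw count, but even this threshold rule captures the idea of letting history decide which commits trigger extra regression testing.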
Today, many still question whether big data and analytics are worth the massive investment of cash, resources, and time for enterprises and their leadership. The answer is that data analytics can tremendously affect financial performance. In marketplace-related areas, respondents said that the most important use of data analytics was in “identifying ways to increase sales” (18 percent), followed closely by two other areas that, if mastered, can build sales: “understanding customer behavior” (17 percent) and “targeting product and service offerings to specific customers” (17 percent). Just as significant was the use of analytics in “identifying innovation and investment opportunities,” cited by 17 percent of respondents. Apparently, most organizations implementing analytics to support or optimize their marketing and sales performance are applying those analytics capabilities to various customer-driven projects and processes. As one respondent put it: “We are trying to be an experiential retailer, and that means widening our product line in specific categories and becoming more analytical about predicting which offers will work, and when.”
Financial operations have long been data-driven. However, the availability of big data and the growth of data analytics capabilities have further elevated their importance. These are doubtless the reasons that the area most often found to invest in analytics, at 79 percent, is finance.
Additionally, around 18 percent of the enterprises surveyed report that the Chief Financial Officer is the person within the organization primarily responsible for analytics, making the CFO the third most common analytics leader. It stands to reason that if finance is willing to invest in analytics, there is ROI to be had. “Today, it’s not the amount of (analytics) work that can be done inside the organization, but whether we concentrate on what gives us the best ROI,” noted a customer analytics expert at a global technology company. CFOs and their teams frequently (at a little more than 24 percent) use data analytics in “forecasting financial performance,” while another 23.5 percent use analytics for “understanding the drivers of economic performance.” In other research, Deloitte Analytics has found numerous instances of organizations moving all analytics staff into a centralized, shared-services function reporting directly to finance, which serves as a Switzerland: a neutral group that can supply the whole organization without the political intrigues that can derail a more decentralized approach. Where better to have that happen than in finance?
Data analytics tools are increasingly being used to bolster the top line, as customer-facing planning areas use analytics resources for a variety of revenue-related activities. Around 27 percent of respondents believe that analytics is most critical for expanding sales to new and existing customers, followed by 17 percent who believe it is most essential for initiatives to reduce customer churn and build loyalty.
References
Big Data and Analytics. (n.d.). Online Marketing and Big Data Exploration.
Big Data and Predictive Analytics Are Now Easily Accessible to All Marketers. (2015). Predictive Marketing: Easy Ways Every Marketer Can Use Customer Analytics and Big Data, 1-22.
Big Data Sources. (2015). Big Data Analytics: Turning Big Data into Big Money, 37-46.
Loshin, D. (2013). Business problems suited to big data analytics. Big Data Analytics, 11-19.
Mohanty, S., Jagadeesh, M., & Srivatsa, H. (2013). Application architectures for big data and analytics. Big Data Imperatives, 107-154.
The Analytics Advantage. (n.d.). Deloitte Analytics Services. Retrieved February 07, 2016, from http://www2.deloitte.com/global/en/pages/deloitte-analytics/articles/the-analytics-advantage.html
The Process of Building a Cognitive Application. (2015). Cognitive Computing and Big Data Analytics, 157-173.
Wang, P., Ali, A., Kelly, W., & Zhang, J. (2014). InVideo: A novel big data analytics tool for video data analytics and its use in enhancing interactions in cybersecurity online education. Advances in Communication Technology and Application.
Why Big Data Matters. (2015). Big Data Analytics: Turning Big Data into Big Money, 11-19.