Latest News for 4TC
We have loads to say!
In the previous article, we looked at the modern office and concluded that technology has radically changed both our work environments and our personal lives. We also concluded that simply purchasing technology is insufficient: you must choose technology that is strategically suited to how you and your team operate, and you need to be confident that the support behind that technology is of the highest calibre.
The duties of your support provider are far more varied than they used to be, and IT assistance is no longer just support.
We’ll highlight a few things to watch out for while you browse the websites of possible providers later on in this post.
If your supplier is not proactive, you have little chance of benefiting from the most recent technology. If the tools you are using right now aren’t helping your company grow, you shouldn’t be forced to keep using them. Proactivity can be revolutionary for the progress of your organisation, as well as from a monitoring perspective. Don’t be misled, though: just because an IT company advertises 24/7 monitoring on its website doesn’t mean it is proactive. Being proactive takes time. The service provider who will best serve your needs will get to know you, your staff, and what makes your company tick. Look at the pledges businesses make when you browse the websites of potential providers; if you’re not sure whether they offer this consultancy as part of their SLA, contact them and ask.
In the contemporary environment, remote support ought to be taken seriously. Traditionally, you would phone your provider to report a problem, they would attempt to talk you through a fix and, if necessary, they would send an engineer to your location to resolve it in person. This is no longer essential, because many service providers now use software that enables you to submit a request, which is quickly followed by a technician entering your system – provided you permit them access through a remote session – and resolving the issue remotely. Hopefully, watching the fix will also teach you what to do if the issue recurs.
Finding a provider with experience and expertise working with businesses in your industry sector is the most obvious recommendation. Consider this: can someone give you the best advice on what to adopt in your company if they have no experience working with a profitable enterprise in your industry?
Watch out for references to a strategic collaboration, a tailored strategy, or strategic alignment; all have the same meaning and, if made, are positive indicators. It is up to your supplier to make the effort to forge a strategic partnership with you; this is made possible via a combination of initiative and knowledge. Your provider should approach you with an IT strategy that is future-proof and uses the capabilities of current technology. Everything should be geared toward maximising efficiency so that you, your team, and eventually your organisation can continue to advance. As a result, there are a lot of things to check out on a potential provider’s website. These articles should make it clear to you what is expected of an IT team today and how, with the correct assistance, technology may actually work in your favour. This can be difficult, so if you still need clarification or assistance, please don’t hesitate to contact us.
If you are unsure about any of these areas then please don’t hesitate to reach out to us. Our experts here at 4TC can help you with your IT Support or any other technical queries.
The modern world has changed as a result of revolutionary technology. Technology is continuously changing, and it is thanks to this progress that both our personal and professional lives are now easier and much more connected than they ever were. Businesses all around the world are gradually integrating technology into every aspect of their operations, some without even realising it. Whether it’s a small family-run company or a huge multinational behemoth, tech is having an impact.
The use of the proper tools and, more importantly, the appropriate amount of support for those tools’ use, could not be more crucial today, as technology is perhaps the most vital component of a successful business.
The influx of business technology has led to an explosion in IT support companies, and aspiring business owners have seen an opportunity. The majority of these companies make outrageous claims like, “We give the best support,” “We approach your IT landscape proactively,” or “We cater our service to you.” How can you even define “the best” IT support? How is it measured? The answer unquestionably depends on the goals you have for IT.
One widespread misunderstanding about IT support is that it is just support. Despite the name, it is no longer only support. Yes, there was a period when that was all it meant, but only because technology hadn’t advanced enough to meet everyone’s requirements. There was no scope for making future plans. The IT was functional, but that was about the best you could hope for: you had to adapt your business to fit within its constraints rather than the other way around. Fortunately, we can now tailor our IT support plans to each client rather than forcing them to adjust to IT’s requirements.
Modern IT support should do more; it should be tailored to your business and change to meet your demands. The IT professionals who handle your account ought to care about your business; they should make every effort to become part of your team and work hard to help you achieve your business objectives with the aid of cutting-edge technology.
An IT organisation needs to take a proactive stance. They should be genuinely interested in your company’s future and want to know about its past. The majority of IT suppliers behave more like an insurance provider than a partner, only showing interest or offering support when an issue develops. By demonstrating genuine interest, a provider can plan for the worst-case scenario in a way that best suits your particular management style and, consequently, your entire organisation. How can they accomplish that if they are unaware of how your business actually operates on a daily basis? Since every company in the world is unique, why are we all given the same treatment?
A proactive approach enables you to anticipate situations far more successfully; in the event of an issue that could define your organisation, you and your team will be ready—or at the very least, have developed a plan to keep your team operating. Strategically aligned IT, combined with IT support that reflects it, is crucial because being able to foresee problems prevents small ones from growing into large ones.
Any IT provider who restricts the number of calls you are permitted to make to them should be ruled out right away. How could you possibly foresee how frequently you may need assistance, or when that assistance may be required? The premise alone is absurd. If you knew that, you wouldn’t ever need help in the first place.
Tech-savvy people aren’t always the best communicators; while this doesn’t diminish their technical expertise, it does affect the level of service they can provide. Inadequate levels or quality of communication can harm your IT environment as well as your relationship with your provider. Your provider must be in continual touch with you, show interest in your future plans, and discuss with you any concerns they have about your firm’s wider technology landscape. If they don’t do these things, you have little chance of being ready for potential problems. You won’t even get that far if communication is bad from the beginning, since how else would they understand your worries and goals, let alone help you achieve them?
You’ll communicate with the right providers frequently. Of course, don’t anticipate a call every day; a call once a month will do. They must make a concerted effort to learn about you, your team, your company’s operations in general, its history, and your long-term goals.
The top support providers will not only become technically familiar with your company; they will also want to know what makes it successful, what operational issues can jeopardise it, the difficulties you have on a daily basis, and what they can do to assist.
An SLA is crucial because it sets out the criteria you and the service provider agree upon and the guarantees they give regarding the service. Take your time to discuss things with potential providers; don’t just accept the first contract that is presented to you. Each supplier offers a comparatively unique product.
If this article has the desired effect, it will arm you with the knowledge to make an informed decision about what is best for you, your team, and the future success of your business. The following article will be a checklist of what to look for in an IT provider, outlining what to watch for on potential providers’ websites.
If you are unsure about any of these areas then please don’t hesitate to reach out to us. Our experts here at 4TC can help you with your IT Support or any other technical queries.
At the height of the pandemic, 49% of UK workers found themselves working remotely at least one day per week. The same figure for 2019 stood at just 12%, illustrating the dramatic culture change that was forced upon workplaces across the country.
Today, whilst many of us have returned to the office on a permanent basis, 22% of employees continue to work remotely at least one day per week, and 11% (roughly one in eight) report working from home exclusively: proof that the concept can endure when correctly implemented.
With benefits ranging from improved employee retention and enhanced work-life balance to reduced pressure on office space and a reduction in costly commutes, enabling remote work can be advantageous for employers and employees alike. However, to ensure your remote team remains motivated and productive it’s vital that they have access to the right tools and services.
At the heart of a workable remote working strategy is a means by which to provide secure, unhindered access to the data and services your remote team need, a concept known as ‘secure remote access.’ In this article we’ll look at some of the most popular options for providing secure remote access with consideration given to the advantages and drawbacks of each solution.
Virtual Private Networks are one of the most popular ways to grant geographically dispersed employees access to corporate resources. VPNs can be configured to allow remote employees to access resources such as email services, enterprise software (scheduling tools, project management tools, CRMs etc) and data vaults without compromising the security of sensitive information.
A VPN extends the privacy afforded by an on-premise network to remote employees by encrypting all transiting data. Authorised end users log into the VPN using a client portal on their endpoint device and this establishes a protected, private connection between the user and the internal network, ensuring sensitive data is kept safe from prying eyes.
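As a quick, practical aside (this sketch is ours, not part of the original article), a remote worker can sanity-check that their traffic really is leaving via the corporate VPN by comparing the public IP address the outside world sees with the office’s known egress range. The ipify service used below is a real, free IP-echo endpoint; the corporate IP prefix shown is a purely hypothetical example.

```python
# Minimal sketch: confirm the VPN tunnel is carrying traffic by checking the
# apparent public IP. The office egress prefix below is a hypothetical value.
import urllib.request

def current_public_ip() -> str:
    with urllib.request.urlopen("https://api.ipify.org") as response:
        return response.read().decode().strip()

CORPORATE_EGRESS_PREFIX = "203.0.113."  # hypothetical office IP range

ip = current_public_ip()
if ip.startswith(CORPORATE_EGRESS_PREFIX):
    print(f"Public IP {ip} matches the office range - VPN tunnel looks active.")
else:
    print(f"Public IP {ip} is outside the office range - traffic is not using the VPN.")
```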
Advantages of Remote Access VPNs
Disadvantages of Remote Access VPNs
Remote Desktop setups allow remote users to take command of a specific, office-based computer via an active internet connection. IT support is a common application of such technology, allowing technicians to diagnose issues and apply fixes to a machine without having to be physically present.
Using a client application installed on the remote device, users can access files and applications stored on an elected ‘host’ device as though they were sitting in front of it, with the service transmitting mouse and keyboard inputs from the client device to the host. In addition to desktop PCs, Remote Desktops can also be used to access servers and virtual server environments.
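Before launching a session, it can help to confirm that the host machine is powered on and reachable. The short sketch below (our illustration, not from the article) checks whether a host is listening on TCP port 3389, the default Remote Desktop Protocol port; the hostname is a hypothetical placeholder.

```python
# Minimal sketch: test whether a would-be Remote Desktop host is reachable
# on the default RDP port (3389) before attempting a session.
import socket

def rdp_reachable(host: str, port: int = 3389, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the host's RDP port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    host = "office-pc-01.example.internal"  # hypothetical host name
    print(f"{host} reachable on port 3389: {rdp_reachable(host)}")
```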
Advantages of Remote Desktops
Disadvantages of Remote Desktops
While SD-WAN (Software-defined wide area network) is most often associated with medium to large companies seeking to connect multiple branch offices to their headquarters, the technology is increasingly being used to keep remote workers connected to the cloud-hosted and on-premise resources they need access to.
Wide area network (WAN) connections extend corporate networks over a large geographical area using numerous connectivity technologies, including MPLS, wireless data connections, broadband, VPNs and the internet, allowing employees to access the resources they need from any location. SD-WANs feature software which monitors the quality of these connections, ensuring data is routed via the most reliable and efficient pathway at all times.
SD-WANs not only connect remote users with office-hosted apps and data, but also link to cloud-hosted services, including trusted SaaS providers, for a fully optimised remote access experience, often backed by SLA-defined performance guarantees.
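To make the idea of quality-based path selection concrete, here is an illustrative sketch (not an actual SD-WAN implementation): the controller scores each available link on measured loss, latency and jitter, and steers traffic over the healthiest one. The figures and scoring weights are hypothetical.

```python
# Illustrative sketch of the path-selection logic an SD-WAN performs.
from dataclasses import dataclass

@dataclass
class PathMetrics:
    name: str
    latency_ms: float
    packet_loss_pct: float
    jitter_ms: float

def score(path: PathMetrics) -> float:
    """Lower is better: weight packet loss most heavily, then latency, then jitter."""
    return path.packet_loss_pct * 100 + path.latency_ms + path.jitter_ms * 2

paths = [
    PathMetrics("MPLS", latency_ms=28, packet_loss_pct=0.0, jitter_ms=2),
    PathMetrics("Broadband", latency_ms=18, packet_loss_pct=0.5, jitter_ms=6),
    PathMetrics("4G backup", latency_ms=55, packet_loss_pct=1.2, jitter_ms=12),
]

best = min(paths, key=score)
print(f"Routing traffic via: {best.name}")
```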
Advantages of SD-WAN
Disadvantages of SD-WAN
Over the last decade, the ‘Cloud’ has transformed the way businesses access IT services and launch new tech projects, with software, infrastructure and development platforms accessible on a subscription basis via cloud service providers. Today, the Software-as-a-service marketplace features enterprise software for almost any conceivable business function, from marketing and HR to accounting and sales.
Cloud-hosted software combined with cloud storage can be a convenient, secure and low-hassle way to give remote employees access to the apps and data they need.
Microsoft 365 is perhaps the pre-eminent SaaS offering, featuring a broad suite of tools designed to empower remote workforces.
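Because SaaS platforms expose their data through web APIs, remote workers (and the tools they use) can reach files from anywhere with nothing more than an internet connection and valid credentials. The sketch below is a hedged illustration, not production code: the Microsoft Graph endpoint is real, but obtaining the access token (normally via your organisation’s OAuth sign-in flow) is outside the scope of this example.

```python
# Hedged sketch: list the files in a user's OneDrive via the Microsoft Graph API.
# ACCESS_TOKEN acquisition is assumed to happen elsewhere (OAuth sign-in).
import json
import urllib.request

ACCESS_TOKEN = "<OAuth token obtained via your organisation's sign-in flow>"

request = urllib.request.Request(
    "https://graph.microsoft.com/v1.0/me/drive/root/children",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
with urllib.request.urlopen(request) as response:
    items = json.load(response)

for item in items.get("value", []):
    print(item["name"])
```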
Advantages of SaaS
Disadvantages of SaaS
The technology is primed to become faster, more versatile, and thankfully cheaper.
Technologies enabled by quantum science will help researchers better understand the natural world and harness quantum phenomena to benefit society. They will transform health care, transportation, and communications, and enhance resilience to cyber threats and climate catastrophes. For example, quantum magnetic field sensors will enable functional brain imaging; quantum optical communications will permit encrypted communications; and quantum computers will facilitate the discovery of next-generation materials for photovoltaics and medicines.
Currently, these technologies rely on materials that are expensive and complicated to prepare, and they often require expensive and bulky cryogenic cooling to operate. Such equipment relies on precious commodities such as liquid helium, which is becoming increasingly costly as the global supply dwindles. 2023 will see a revolution in materials innovation that will transform quantum technologies. Alongside reducing environmental demands, these materials will allow for room-temperature operation and energy savings, as well as being low-cost and simple to process. To optimize their quantum properties, research labs can manipulate chemical structure and molecular packing. The good news is that physicists and engineers have been busy, and 2023 will see these materials moving from science labs to the real world.
Recently, the UK Engineering and Physical Sciences Research Council announced a vision for innovation in materials for quantum technologies, led by Imperial College London and the University of Manchester. The London Centre for Nanotechnology—a collaboration of hundreds of researchers across Imperial, King’s and University College London—has considerable expertise in the simulation and characterization of quantum systems. The UK’s home for measurement—the National Physical Laboratory—just opened the Quantum Metrology Institute, a multimillion-pound facility dedicated to the characterization, validation, and commercialization of quantum technologies. Working together, researchers and industry will usher in a new era in pharmaceuticals, cryptography, and cybersecurity.
Qubits, the building blocks of quantum computers, rely on materials with quantum properties, like electron spin, which can be manipulated. Once we can harness these properties, we can control them using light and magnetic fields, creating quantum phenomena such as entanglement and superposition. Superconducting qubits, the current state-of-the-art for qubit technology, comprise Josephson junctions that operate as superconductors (materials that can conduct electricity with zero resistance) at super-low temperatures (–273ºC). The harsh temperature and high-frequency operation requirements mean that even the most basic aspects of these superconducting qubits—the dielectrics—are tricky to design. At the moment, qubits include materials like silicon nitride and silicon oxide, which have so many defects that the qubits themselves have to be millimeter-sized to store electrical field energy, and crosstalk between adjacent qubits introduces considerable noise. Getting to the millions of qubits required for a practical quantum computer would be impossible with these materials.
2023 will see more innovation in the design of materials for quantum technologies. Of the many awesome candidates considered so far (e.g., diamonds with nitrogen vacancy defects, van der Waals/2D materials, and high-temperature superconductors), I’m most excited about the use of molecular materials. These materials are designed around carbon-based organic semiconductors, which are an established class of materials for the scalable manufacture of consumer electronics (having revolutionized the multibillion-dollar OLED display industry). We can use chemistry to control their optical and electronic properties, and the infrastructure surrounding their development relies on established expertise.
For example, chiral molecular materials—molecules that exist as a pair of non-superimposable mirror images—will revolutionize quantum technologies. Thin, single-handed layers of these remarkably versatile molecules can be used to control the spin of electrons at room temperature. At the same time, the long spin coherence times and good thermal and chemical stability of metal phthalocyanines will see them being used to carry quantum information.
While 2023 will undoubtedly see more bombastic headlines about the operating speeds of quantum computers, materials scientists will be studying, discovering, and designing the next generation of low-cost, high-efficiency, and sustainable quantum technologies.
Source: New Materials Will Bring the Next Generation of Quantum Computers | WIRED UK
With cloud costs and complexity higher than expected, many enterprises are making a U-turn and putting applications and data back in traditional systems.
Here’s a topic we don’t discuss as much as we should: public cloud repatriation. Many regard repatriating data and applications back to enterprise data centers from a public cloud provider as an admission that someone made a big mistake moving the workloads to the cloud in the first place.
I don’t automatically consider this a failure as much as an adjustment of hosting platforms based on current economic realities. Many cite the high cost of cloud computing as the reason for moving back to more traditional platforms.
High cloud bills are rarely the fault of the cloud providers. They are often self-inflicted by enterprises that don’t refactor applications and data to optimize their cost-efficiencies on the new cloud platforms. Yes, the applications work as well as they did on the original platform, but you’ll pay for the inefficiencies you chose not to deal with during the migration. The cloud bills are higher than expected because lifted-and-shifted applications can’t take advantage of native capabilities such as auto-scaling, security, and storage management that allow workloads to function efficiently.
It’s easy to point out the folly of not refactoring data and applications for cloud platforms during migration. The reality is that refactoring is time-consuming and expensive, and the pandemic put many enterprises under tight deadlines to migrate to the cloud. For enterprises that did not optimize systems for migration, it doesn’t make much economic sense to refactor those workloads now. Repatriation is often a more cost-effective option for these enterprises, even considering the hassle and expense of operating your own systems in your own data center.
In a happy coincidence, the prices of hard drive storage, networking hardware, computer hardware, power supplies, and other tech gear dropped in the past 10 years while cloud computing costs remained about the same or a bit higher.
Business is business. You can’t ignore the fact that it makes economic sense to move some workloads back to a traditional data center.
It makes the most sense to repatriate workloads and data storage that typically do a lot of the same thing, such as simply storing data for long periods of time without any special data processing (e.g., no advanced artificial intelligence or business intelligence). These workloads can often move back to owned hardware and show a net ROI gain. Even with the added costs to take over and internalize operations, the enterprise saves money (or a lot of money) compared to equivalent public cloud hosting.
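To illustrate the kind of sum behind such a decision, here is a purely hypothetical back-of-envelope comparison for a “cold” archive that is simply stored for long periods. Every figure below is a made-up placeholder, not a quote from any provider, and a real evaluation would also include migration, staffing and refresh costs.

```python
# Hypothetical cost comparison for repatriating long-term "cold" storage.
cloud_cost_per_gb_month = 0.02      # hypothetical object-storage price per GB per month
data_tb = 200                       # hypothetical archive size in TB
months = 36                         # evaluation period

cloud_total = cloud_cost_per_gb_month * data_tb * 1024 * months

on_prem_hardware = 60_000           # hypothetical storage hardware, amortised over the period
on_prem_running = 800 * months      # hypothetical power, space and admin per month
on_prem_total = on_prem_hardware + on_prem_running

print(f"Cloud storage over {months} months:  £{cloud_total:,.0f}")
print(f"On-premises over {months} months:    £{on_prem_total:,.0f}")
```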
However, don’t forget that many workloads have dependencies on specialized cloud-based services. Those workloads typically cannot be repatriated because affordable analogs are unlikely to run on traditional platforms. When advanced IT services are involved (AI, deep analytics, massive scaling, quantum computing, etc.), public clouds typically are more economical.
Many enterprises made a deliberate business decision at the time to absorb the additional costs of running lifted-and-shifted applications on public clouds. Now, based on today’s business environment and economics, many enterprises will make a simple decision to bring some workloads back into their data center.
The overall goal is to find the most optimized architecture to support your business. Sometimes it’s on a public cloud; many times, it’s not. Or not yet. I learned a long time ago not to fall blindly in love with any technology, including cloud computing.
2023 may indeed be the year we begin repatriating applications and data stores that are more cost-effective to run inside a traditional enterprise data center. This is not a criticism of cloud computing. Like any technology, cloud computing is better for some uses than for others. That “fact” will evolve and change over time, and businesses will adjust again. No shame in that.
Source: 2023 could be the year of public cloud repatriation | InfoWorld
Depending on who you ask, some say that quantum computers could either break the Internet, rendering pretty much every data security protocol obsolete, or allow us to compute our way out of the climate crisis.
These hyper-powerful devices, an emerging technology that exploits the properties of quantum mechanics, are much buzzed about.
Only last month, IBM unveiled its latest quantum computer, Osprey, a new 433 qubit processor that is three times more powerful than its predecessor built only in 2021.
But what is all the hype about?
Quantum is a field of science that studies the physical properties of nature at the scale of atoms and subatomic particles.
Proponents of quantum technology say these machines could usher in rapid advances in fields like drug discovery and materials science – a prospect that dangles the tantalising possibility of creating, for example, lighter, more efficient, electric vehicle batteries or materials that could facilitate effective CO2 capture.
With the climate crisis looming, any technology with a hope of solving complex issues like these is bound to draw keen interest.
Little wonder then, that some of the largest tech companies in the world – Google, Microsoft, Amazon, and, of course, IBM to name a few – are investing heavily in it and angling to stake their place in a quantum future.
Given these utopic-sounding machines are drawing such frenzied interest, it would perhaps be useful to understand how they work and what differentiates them from classical computing.
Take every device that we have today – from the smartphones in our pockets to our most powerful supercomputers. These operate and have always operated on the same principle of binary code.
Essentially, the chips in our computers use tiny transistors that function as on/off switches to give two possible values, 0 or 1, otherwise known as bits, short for binary digits.
These bits can be configured into larger, more complex units, essentially long strings of 0s and 1s encoded with data commands that tell the computer what to do: display a video; show a Facebook post; play an mp3; let you type an email, and so on.
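As a tiny illustration of our own (not from the article), here is how the two-character message “Hi” looks once it has been reduced to the 8-bit binary pattern for each character, which is all a classical chip ever really handles.

```python
# Show the text "Hi" as the raw bits a classical computer stores and processes.
message = "Hi"
bits = " ".join(format(byte, "08b") for byte in message.encode("ascii"))
print(bits)  # 01001000 01101001
```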
But a quantum computer?
These machines function in an entirely different way. In the place of bits in a classical computer, the basic unit of information in quantum computing is what’s known as a quantum bit, or qubit. These are typically subatomic particles like photons or electrons.
The key to a quantum machine’s advanced computational power lies in its ability to manipulate these qubits.
“A qubit is a two-level quantum system that allows you to store quantum information,” Ivano Tavernelli, the global leader for advanced algorithms for quantum simulations at the IBM Research Lab in Zurich, explained to Euronews Next.
“Instead of having only the two levels zero and one that you would have in a classical calculation here, we can build a superposition of these two states,” he added.
Superposition in qubits means that unlike a binary system with its two possible values, 0 or 1, a qubit in superposition can be 0 or 1 or 0 and 1 at the same time.
And if you can’t wrap your head around that, the analogy often given is that of a penny.
When it is stationary a penny has two faces, heads or tails. But if you flip it? Or spin it? In a way, it is both heads and tails at the same time until it lands and you can measure it.
And for computing, this ability to be in multiple states at the same time means that you have an exponentially larger amount of states in which to encode data, making quantum computers exponentially more powerful than traditional, binary code computers.
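For readers who like to see the numbers, here is a small sketch of our own (the article contains no code, and NumPy is simply an assumed tool): a qubit state is a pair of complex amplitudes, and applying a Hadamard gate to the |0> state produces an equal superposition in which 0 and 1 are each measured with probability 0.5.

```python
# Minimal numerical sketch of superposition: Hadamard gate applied to |0>.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

superposed = H @ ket0
probabilities = np.abs(superposed) ** 2
print(probabilities)  # [0.5 0.5] - equal chance of measuring 0 or 1
```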
Another property crucial to how quantum computing works is entanglement. It’s a somewhat mysterious feature of quantum mechanics that even baffled Einstein in his time who declared it “spooky action at a distance”.
When two qubits are generated in an entangled state there is a direct measurable correlation between what happens to one qubit in an entangled pair and what happens to the other, no matter how far apart they are. This phenomenon has no equivalent in the classical world.
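Again as a small sketch of our own (not from the article), the simplest entangled state, the Bell state (|00> + |11>) / sqrt(2), only ever yields correlated outcomes: both qubits read 0 or both read 1, never a mix.

```python
# Minimal sketch of entanglement: measurement probabilities of a Bell state.
import numpy as np

bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)   # amplitudes for |00> and |11>

probabilities = np.abs(bell) ** 2
for outcome, p in zip(["00", "01", "10", "11"], probabilities):
    print(f"P({outcome}) = {p:.2f}")  # 0.50, 0.00, 0.00, 0.50
```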
“This property of entanglement is very important because it brings a much, much stronger connectivity between the different units and qubits. So the elaboration power of this system is stronger and better than the classical computer,” Alessandro Curioni, the director of the IBM Research Lab in Zurich, explained to Euronews Next.
In fact, this year, the Nobel Prize for physics was awarded to three scientists, Alain Aspect, John Clauser, and Anton Zeilinger, for their experiments on entanglement and advancing the field of quantum information.
So, in an admittedly simplified nutshell, these are the building blocks of how quantum computers work.
But again, why do we necessarily need such hyper-powerful machines when we already have supercomputers?
“[The] quantum computer is going to make, much easier, the simulation of the physical world,” Curioni said.
“A quantum computer is going to be able to better simulate the quantum world, so simulation of atoms and molecules”.
As Curioni explains, this will allow quantum computers to aid in the design and discovery of new materials with tailored properties.
“If I am able to design a better material for energy storage, I can solve the problem of mobility. If I am able to design a better material as a fertiliser, I am able to solve the problem of hunger and food production. If I am able to design a new material that allows [us] to do CO2 capture, I am able to solve the problem of climate change,” he said.
But there could also be some undesirable side effects that have to be accounted for as we enter the quantum age.
A primary concern is that quantum computers of the future could be possessed of such powerful calculation ability that they could break the encryption protocols fundamental to the security of the Internet that we have today.
“When people communicate over the Internet, anyone can listen to the conversation. So they have to first be encrypted. And the way encryption works between two people who haven’t met is they have to rely on some algorithms, known as RSA or Elliptic Curve Diffie–Hellman, to exchange a secret key,” Vadim Lyubashevsky, a cryptographer at the IBM Research Lab in Zurich, explained.
“Exchanging the secret key is the hard part, and those require some mathematical assumptions which become broken with quantum computers”.
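To show what such a key exchange rests on, here is a toy illustration of our own with deliberately tiny numbers (real deployments use numbers thousands of bits long): Diffie–Hellman lets two parties agree a shared secret over an open channel, and its security depends on recovering the private exponents from the public values being hard, which is exactly the kind of problem Shor’s algorithm on a quantum computer would make easy.

```python
# Toy Diffie-Hellman key exchange with tiny, insecure numbers (illustration only).
p, g = 23, 5                              # small public prime and generator

alice_secret, bob_secret = 6, 15          # private exponents, never transmitted
alice_public = pow(g, alice_secret, p)    # sent over the open Internet
bob_public = pow(g, bob_secret, p)

alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)
print(alice_key, bob_key)                 # both parties arrive at the same shared secret
```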
In order to protect against this, Lyubashevsky says that organisations and state actors should already be updating their cryptography to quantum-safe algorithms, i.e. ones that cannot be broken by quantum computers.
Many of these algorithms have already been built and others are in development.
“Even if we don’t have a quantum computer, we can write algorithms and we know what it will do once it exists, how it will run these algorithms,” he said.
“We have concrete expectations for what a particular quantum computer will do and how it will break certain encryption schemes or certain other cryptographic schemes. So, we can definitely prepare for things like that,” Lyubashevsky added.
“And that makes sense. It makes sense to prepare for things like that because we know exactly what they’re going to do”.
But then there is the issue of data that already exists which hasn’t been encrypted with quantum-safe algorithms.
“There’s a very big danger that government organisations right now are already storing a lot of Internet traffic in the hopes that once they build a quantum computer, they’ll be able to decipher it,” he said.
“So, even though things are still secure now, maybe something’s being transmitted now that is still interesting in ten, 15 years. And that’s when the government, whoever builds a quantum computer, will be able to decrypt it and perhaps use that information that he shouldn’t be using”.
Despite this, weighed against the potential benefits of quantum computing, Lyubashevsky says these risks shouldn’t stop the development of these machines.
“Breaking cryptography is not the point of quantum computers, that’s just a side effect,” he said.
“It’ll have hopefully a lot more useful utilities like increasing the speed with which you can discover chemical reactions and use that for medicine and things like that. So this is the point of a quantum computer,” he added.
“And sure, it has the negative side effect that it’ll break cryptography. But that’s not a reason not to build a quantum computer, because we can patch that and we have patched that. So that’s sort of an easy problem to solve there”.
Source: What is quantum computing and how will quantum computers change the world? | Euronews
Email: support@4tc.co.uk
Tel: 020 7250 3840
5th Floor, 167‑169 Great Portland Street
London
W1W 5PF
Dew Gates The Street
High Roding
Essex
CM6 1NT