BYOAI, are you one of them?
Now I understand why my associate kept harping on, “The first step to managing cybersecurity threats is to understand the relationship between information and the information system,” in his cybersecurity program.
Recently, the concept of “Bringing Your Own AI to Work” (BYOAI) has been gaining traction in Malaysia. This trend sees employees integrating their own AI tools into their work environments, much like the earlier “Bring Your Own Device” (BYOD) trend. Interestingly, based on the report, Malaysians seem more inclined to adopt BYOAI than their counterparts elsewhere in Asia. It looks like digitalization in Malaysia may scale faster than expected. While the rapid adoption of AI presents opportunities, it also poses challenges for companies trying to keep pace with the trend without security policies in place.
Ok, let us explore what exactly BYOAI entails and how it can pose risks to organizations.
The BYOAI tools we commonly use are:
- Personal AI assistants like Google Assistant, Siri, smart email, etc.
- Collaboration and communication tools such as language translation (e.g. Google Translate) or meeting transcription (e.g. Otter.ai)
- Writing assistants such as Grammarly for grammar checks and writing improvement
- Learning and development with AI-driven learning platforms like Duolingo, etc.
I have personally used these tools, and they are impressive: they can help us be innovative, efficient and agile in our work.
However, while BYOAI can enhance our productivity, it can also pose risks to the company if not managed properly. For example, if personal AI tools are not adequately secured or compliant with company policies, they could expose sensitive information to unauthorized access. Additionally, compatibility issues with existing systems may lead to inefficiencies and errors. It’s crucial to implement robust security measures and establish clear guidelines to mitigate these risks and ensure the safe integration of personal AI tools into the workplace.
As such, companies must establish governing rules if they:
- do not have data security policies and practices in place.
- do not address compatibility issues between different AI tools and platforms, as some may need to integrate with the company’s existing systems and workflows.
- do not check to ensure the use of these AI tools adheres to industry regulations and standards.
- do not have a monitoring mechanism in place to avoid ethical and compliance issues.
- do not provide adequate training to employees on the ethics of using these tools.
As I write this article, I now understand why my associate emphasized, “The first step to managing cybersecurity threats is to understand the relationship between information and the information system,” in his cybersecurity program. This foundational principle highlights the critical need to grasp how data interacts with the systems that store, process, and transmit it. By comprehending this relationship, organizations can better identify vulnerabilities, implement effective security measures, and ultimately safeguard their information assets against cyber threats.
written by Elsie Low
Sign up now!
Check out our featured courses that can enhance your digital transformation journey. Register now to enjoy a 10% discount for the next intake.