About me
I graduated in 2002 with a Bachelor's degree in Computer Science, majoring in Artificial Intelligence. For my final-year thesis project, titled "Dynamic Bandwidth Allocation implementing Neural-Fuzzy (Neural Network + Fuzzy Logic) Technique simulated in JAVA Network Simulator," I conducted an intensive study of Neural Networks and Fuzzy Logic. The project proposed a Neural-Fuzzy model to intelligently adjust bandwidth allocation for ATM networks.
I then worked as a Research Assistant for two years while pursuing my Master of Computer Science degree at the University of Malaya, Malaysia. Throughout my master's program, I engaged in research across several fields, including Image Processing & Computer Vision, Biological Computational Systems, Computer Graphics, and High-Performance Computing. After two years of research work, I published a journal paper titled "Non-Invasive Method for Patient-Specific Virtual Heart Based on Fiber-Fluid Model." The project received significant recognition and multiple awards, including the Gold Medal at the Invention Exhibition of New Invention, Techniques, and Products 2005 held in Geneva, and the Bronze Medal at the 16th International Invention Innovation Industrial Design & Technology Exhibition 2005 (ITEX 2005). In recognition of these achievements, I received the 'Saintis Cemerlang' (Excellent Scientist) award from the Ministry of Higher Education Malaysia in August 2005.
After completing my MSc in Computer Science in 2005, I joined Intel Corporation as a Station Controller Developer. In this role, I developed a series of software tools to control production machines and gather quantitative monitoring data from them. Over two years, I gained extensive knowledge of semiconductor assembly-line processes and operations, and delved into machine vision technology, particularly industrial vision systems used for defect detection and monitoring on production lines. Within those two years, I delivered four machines to customers and completed projects awarded through government grants. Working in this startup company gave me invaluable experience not only in technical work but also in marketing, business analysis, and customer service, broadening my expertise well beyond my core technical skills.
I next worked as a staff researcher in the Knowledge Technology cluster at MIMOS, the National Research Center of Malaysia. My projects centered on Semantic Technology and Image Understanding, under the purview of the cluster's Artificial Intelligence Center. One of my primary contributions was in ontology-driven image content understanding and retrieval: I collaborated with principal researchers to build a multi-model processing framework as part of the Semantic Technology platform. I also played a significant role in the research and development of Image Understanding software components. These components encompassed various low-level image processing techniques, such as image segmentation, extraction of features of interest, color analysis, and image classification, all integrated with semantic technologies for high-level conceptual knowledge analysis. The image resources covered domains such as CCTV surveillance images and healthcare images like MRI, CT, and X-Ray scans. During this period, I completed eight papers, seven of which were published, filed twelve patents, contributed to three Proof-of-Concept systems, and helped organize three AI conferences.
I then joined Hewlett-Packard (HP) Malaysia, a renowned organization deeply engaged in cutting-edge technology, particularly Big Data analytics, a field I was eager to explore. At HP, I worked with a range of technologies, including Hadoop, Autonomy, Vertica, and QlikView, and contributed significantly to establishing a robust Big Data analytics environment with advanced processing and development capabilities. This environment incorporated Apache Hadoop open-source technologies such as HDFS, HBase, Hive, Flume, Sqoop, Pig, Mahout, and Oozie. My expertise extended to MapReduce program development, scripting, and administration of the Hadoop processing environment. Recognizing the importance of continuous growth, I sourced relevant training for our local team and participated in those learning opportunities myself. As a liaison between sites in the US, India, China, and Malaysia, I facilitated communication and engagement on projects involving Autonomy, Vertica, and QlikView, helping the local team build substantial knowledge in Big Data analytics. Inspired by HP's culture of meaningful innovation, I developed Proof-of-Concepts (PoCs) demonstrating the transformative potential of these Big Data technologies in various business use cases: one based on Autonomy's meaning-based computing technology and another centered on Hadoop. Beyond the hands-on work, I documented my findings and insights in two technical white papers based on these PoCs.
Overall, my time at HP allowed me to apply my expertise and passion for technological innovation to the company's work in Big Data analytics and to make a meaningful impact in the IT industry.
Over the past five years, I have pursued opportunities with end-user environment companies. I first joined Cargill as a Technical Data Architect, then moved into the financial domain at Bursa Malaysia. As a member of the data analytics division there, I played a crucial role in driving the success of the Center of Excellence (CoE), a department of eight focused on data product development and analytical services for Bursa's lines of business. The CoE developed a tactical solution with dashboards and visualizations, automating report generation on a cloud-based data platform (AWS). Within a year, we deployed nine dashboards across four divisions, serving approximately 60 users. The CoE also trained Bursa's internal departments to create their own dashboards, write SQL queries, and use Python for data analysis.
Beyond these internal efforts, the CoE established strategic collaborations both internally and externally, allowing us to consolidate useful data, create business use cases, and acquire alternative datasets. Notably, we developed a Retail Investor CX analytics solution and collaborated with Alliance Bank on SME banking solutions. The CoE also provided internal analytical support, including a Datalake project aimed at enhancing decision-making, and took on work in data governance as well as PoCs covering data warehousing, advanced analytics, and data science. Currently, my focus is on delivering an enterprise-wide Datalake project leveraging modern cloud-based technologies and architecture.