ERU - 2013
Permanent URI for this collection: http://192.248.9.226/handle/123/14694
Browse
Recent Submissions
- item (Conference-Abstract): Proceedings of the 19th Annual Research Symposium (Pre Text) (The Engineering Research Unit, University of Moratuwa, 2013-11) Hettiarachchi, TSS
- item (Conference-Abstract): A Framework to analyze the effectiveness of collaborative e-Learning (CeL) in Sri Lankan University Education (2013) Kanaganayagam, I; Fernando, MSD. E-learning introduced new ways of learning using computers and the Internet, and is rapidly evolving with the development of technology. Learners can actively participate and collaborate in a learning process using either synchronous or asynchronous e-learning. Collaborative e-Learning (CeL) is an effective pedagogical approach, which also improves the processing skills, communication skills, and cognitive skills of students. In this research study, a framework was developed to analyze the effectiveness of CeL in Sri Lankan university education. The study was conducted among lecturers, assistant lecturers, undergraduate students and postgraduate students of 15 Sri Lankan national universities. Interviews were used as the research instrument, conducted through different media such as direct interviews, telephone interviews and Skype interviews. Our research findings helped to identify the most effective tools to use in CeL, which improve lecturer-student and student-student interaction. Further, awareness of CeL, perception of the need for CeL, and the assessment criteria for the effectiveness of CeL were identified in the context of Sri Lankan university education. In addition, a framework was developed to analyze the effectiveness of CeL in a quantitative manner. Finally, this paper presents recommendations to improve the effectiveness of CeL in Sri Lankan universities and discusses possible areas for future research in this arena.
- item (Conference-Abstract): Generalized extensions for botnet detection (2013) Balasooriya, BCSSA; Fernando, MSD. Along with the improvement of computer technologies, there has been a significant change in the threat landscape. Large-scale attacks and digital criminal activities have exposed the Internet to serious security breaches and alarmed the world about cyber-crime. At the core of these problems are the so-called botnets. Botnets have a dynamic and flexible nature. The bot-masters, who control the botnets, update the bots and change their code day by day to evade traditional detection methods such as signature-based anti-viruses. Additionally, various techniques are employed by bot-masters to keep their botnets undetectable for as long as possible. Recent botnets consist of millions of infected machines, making this attack vector inevitably harmful. A better understanding of botnets will help to coordinate and develop new technologies to counter this serious security threat. This research analyzes the behavior of botnets, together with possible countermeasures and preventive procedures, and proposes Generalized Extensions for Botnet Detection to detect botnets on computer networks. The proposed Generalized Extensions for Botnet Detection provide a model to detect botnets.
- item (Conference-Abstract): Design and Implementation of a Laboratory-scale Microgrid (2013) Muthugala, VV; Pathirana, NPDS; Nanayakkara, WHKP; Nanayakkara, NPDCD; Hemapala, KTMU. This paper deals with the implementation of a single-phase laboratory-scale microgrid (MG), including a control system based on emulated energy resources and loads, which permits the experimentation of various scenarios. The proposed MG comprises a wind turbine simulator, a solar photovoltaic (PV) array and a battery bank, each connected to the MG via a flexible Voltage Source Inverter. Although an MG can operate either in grid-connected mode or islanded mode, in our research we have focused on islanded operation only. In that mode, the bidirectional inverter plays a major role in maintaining the voltage and frequency at an acceptable level for safe collaborative operation. The results of these studies show the capability of developing a reliable control mechanism for islanded operation of microgrids based on the proposed concept.
- item (Conference-Extended-Abstract): Dress and dressing: an object of communication and activity of culture (9/11/2015) Karunaratne, PVM. Dress is a non-verbal communicative object and dressing is a part of cultural activity. One of the distinctive features of dress is that a group of people shares particular patterns or styles of dress. The style of a dress is a consequence of the culture of a society and the traditions which people follow. The way people dress and decorate their bodies with a variety of jewelry, and the manner of arranging the dress on the body, is a communicative activity. Individuals share ideas and beliefs according to the ideologies of their culture. Besides, they participate in social agreements by which they live, from generation to generation and from individual to individual. The objective of the paper is to investigate the material and non-material culture during the period of the Kotte kingdom by focusing on the extended cultural practices amalgamated with dress and dressing. Further, this article looks for changes in the contextual boundaries of dress signs in order to understand the culture which predominates in dress communication. The method adopted for the research was qualitative. Research findings show that dress fashions and dressing in Sri Lanka are made up of a rich set of possible combinations of cultural and communicative objects, which entails authentic individuation of an outfit.
- item (Conference-Full-text): Development of a new scouring methodology for the textile industry (2013) Wijayapala, UGS; Dharmasena, DKAS; Bandara, DMN; Chathuranga, MAI; Rajapakshe, KS. Scouring is one of the most important processes in fabric formation in the textile industry. The main function of scouring is the removal of hydrophobic impurities in fabrics made out of natural fibers. Normally scouring is done after the sizes are removed in the desizing process. Three main scouring methods can be identified in the current textile industry: alkaline scouring, bio scouring and solvent scouring, with alkaline scouring being the traditional and widely used method. In this research the drawbacks of the existing scouring methods have been discussed under three main aspects: efficiency, economy and environmental friendliness. None of the above three scouring methods satisfies all three aspects, at least up to a reasonable level. In order to achieve all three objectives concurrently, this research focused on developing a new process by combining existing methods, an approach that has not been followed in the past. Under this project, combining alkaline scouring with bio scouring and with solvent scouring were separately considered. The widely used alkaline (NaOH) scouring method was chosen as the reference method, and its aspects were compared with experimental results. Recipes were developed according to the general requirements of the combined agents, tested with 100% cotton twill fabric, and the results were analyzed. Among all tested recipes, the most suitable combination was finalized with the necessary conditions in order to achieve better results in terms of efficiency, economy and environmental friendliness compared to the reference methodology.
- item (Conference-Full-text): Feasibility study on contact dermatitis using textile finishes (2013) Munasinghe, PD; Wanniarachchi, T. Contact Dermatitis (CD) is a skin disease which can make humans fearful of wearing certain fabrications. Formaldehyde and its compounds are the main causative agents, and nickel and chromium also act as allergens for CD. Chemicals used in finishing processes are also identified as causative agents, and fiber types can have an effect on the disease. Symptoms of the disease were reviewed using books and previous research in this study. Reddish blisters, itching and reddish skin are common symptoms of CD. As there is no evidence about CD in Ayurveda, Western medicine and Ayurveda were linked by matching symptoms to find the connectivity of the skin diseases. Ayurveda describes "Kshudra Kushta", which has similar symptoms to those of CD. Herbs such as wild snake gourd, white sandalwood, red sandalwood and heart-leaved moonseed are used in Ayurveda to cure "Kshudra Kushta". Mixtures of herbs with fixing agents were applied to cotton and polyester fabrications through a natural dyeing method to find a fabric finishing method for CD. Two types of fixing agents, copper sulphate and aluminum sulphate, were applied separately in different amounts to identify the best recipe. The herbal mixture has a reddish colour. The mixture was tested with the colour fastness to wash test ISO 105-C01:1987, and the pH value of the solutions was also checked to study the feasibility. Cotton fabrications showed acceptable durability up to three (3) washes, while polyester had poor durability. Wash durability also depends on the amount of fixing agent. There were slight colour changes after the copper sulphate treatment; aluminum sulphate did not show any colour change. The mixture has a neutral pH range. Fourteen (14) different garments which cause symptoms of CD were treated with the herbal mixture and wear trials were carried out. All garments showed positive results up to 3 domestic washes.
- item (Conference-Full-text): Development of a comprehensive fabric quality grading system for selected end uses (2013) Balasooriya, MC; Athukorala, DMA; Sanjaya, MGC; Mayura, S; Niles, SN; Abeysooriya, RP. This study focuses on the development of a comprehensive fabric quality grading system for selected end uses. This system goes beyond currently existing methods by reflecting the suitability of a candidate fabric for a specific end use, by evaluating its key properties and grading the fabric with respect to its overall quality level, and has been developed by studying the retailer fabric specification standards. A set of fabric parameters was selected for each of four retailer customers who were identified by an industrial survey. Each selected fabric parameter was transformed into a sub-index value calculated by an equation for that parameter, using test values obtained from the considered fabric. Weights were assigned to the parameters considering the level of importance identified by the survey for each fabric parameter. A weighted arithmetic mean function was used as the aggregation function, in which the aggregate of the products of the sub-index value and the weighting for each parameter was taken as the overall fabric quality value on a scale of zero to hundred. This system is designed to assist decision makers in selecting a suitable fabric material for a specific end use by comparing the overall quality of several fabrics. A computer application was developed as the user interface to evaluate fabrics using the developed system. The results obtained from this system compared favourably with those obtained through manual evaluation of the fabric.
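The abstract above describes a weighted arithmetic mean as the aggregation function: each parameter's test value maps to a 0-100 sub-index, and the weighted sum gives the overall fabric quality. A minimal sketch of that aggregation follows; the parameter names, targets, tolerances and weights are hypothetical illustrations, not the paper's actual survey-derived values.

```python
def sub_index(value, target, tolerance):
    """Map a measured fabric parameter to a 0-100 sub-index:
    100 at the target, falling linearly to 0 at the tolerance limit."""
    deviation = abs(value - target) / tolerance
    return max(0.0, 100.0 * (1.0 - deviation))

def overall_quality(sub_indices, weights):
    """Weighted arithmetic mean of sub-indices; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[p] * s for p, s in sub_indices.items())

# Hypothetical fabric: three parameters with survey-derived weights.
subs = {
    "tensile_strength": sub_index(480, target=500, tolerance=100),  # 80.0
    "shrinkage":        sub_index(2.5, target=2.0, tolerance=2.0),  # 75.0
    "colour_fastness":  sub_index(4.0, target=5.0, tolerance=2.0),  # 50.0
}
weights = {"tensile_strength": 0.5, "shrinkage": 0.3, "colour_fastness": 0.2}
print(overall_quality(subs, weights))  # 72.5 on the 0-100 scale
```

Decision makers can then rank several candidate fabrics by this single 0-100 score, as the paper's computer application does.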
- item (Conference-Full-text): Developing a system to calculate cutting process time of garments (2013) Upendra, RAS; Kaluarrachchi, P; Weerasekara, KMRD; Ratnayake, VS. This paper presents a methodology to calculate the fabric laying and cutting times for the cutting department of an apparel manufacturing company. The methodology was developed based on data collected for woven trouser patterns. A database of basic times was developed for fabric laying and cutting operations. By analyzing the variations in time values in relation to the parameters that affect the process, relationships were developed which were used to calculate the standard minute values (SMV) for the process. The methodology can be used to calculate the SMV for a lay, and it can be further used to develop incentive schemes for the cutting department.
- item (Conference-Full-text): Siyapath: A P2P gossip based volunteer computing framework (2013) Silva, WGHAM; Hewage, N; Dhanushka, WDM; Nufail, MNM; Nanayakkara, V; Perera, S. This paper presents Siyapath, a distributed volunteer computing framework prototype. It takes a novel approach to solving the problem of the increasing demand for high performance computing, which addresses the demands of computationally intensive tasks. Volunteer computing is one approach to meeting high performance computing requirements: it uses computational power volunteered by the general public to perform computationally intensive tasks in a distributed manner, and a volunteer computing framework provides the infrastructure for a volunteer computing network to operate. The solution presented, Siyapath, is a peer-to-peer, gossip based volunteer computing framework. Existing volunteer computing framework implementations are discussed in this paper, along with their pitfalls and fallacies. The peer-to-peer architecture was incorporated into Siyapath with the intention of overcoming drawbacks present in volunteer computing frameworks based on a client-server architecture. Siyapath uses the functionality provided by Apache Thrift as a cross-language services development framework and uses a gossip based protocol for communication. How Thrift and gossip based protocols help to improve the performance of the framework is also discussed. A proof-of-concept implementation of Siyapath has been completed, comprising the basic functionality of a volunteer computing framework. Details of this design and implementation are described in the paper. Performance tests, including a load test and a scalability test, have been carried out to measure how the framework performs as the network scales. In the latter part of the paper, the results of the tests are presented and analyzed.
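Siyapath's communication layer is gossip based: peers that already hold a piece of information push it to a few randomly chosen peers each round, so knowledge spreads through the network without a central server. A minimal push-gossip sketch follows; the fanout, round count and node names are illustrative assumptions, not Siyapath's actual parameters.

```python
import random

def gossip_rounds(peers, source, fanout=2, rounds=4, seed=7):
    """Each round, every peer that already holds the message pushes it to
    `fanout` randomly chosen peers; return the informed count per round."""
    rng = random.Random(seed)            # fixed seed for a repeatable demo
    informed = {source}
    history = []
    for _ in range(rounds):
        # Snapshot the informed set: pushes this round come from peers
        # that were informed at the start of the round.
        for _peer in list(informed):
            for target in rng.sample(peers, fanout):
                informed.add(target)
        history.append(len(informed))
    return history

peers = [f"node{i}" for i in range(20)]
print(gossip_rounds(peers, "node0"))  # informed count grows round by round
```

The informed set typically grows roughly geometrically until it saturates the network, which is why gossip scales well as peers join — the property the paper's scalability test probes.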
- item (Conference-Full-text): RFID Assisted smart conveyor system with industrial robot hand (2013) Piyasena, RSV; Herath, HMDB; Prasanna, PPGC; Amarasinghe, YWR. In this paper a prototype smart conveyor system is developed with the integration of Radio Frequency Identification (RFID) technology and a robot hand for product handling in a manufacturing environment. A novel Flexible Manufacturing System (FMS) with customer-based production is introduced using web-services technology and a real-time updatable inventory database to enhance the operational efficiency of the manufacturing environment. The FMS designed is a fully automated system with RFID tags/detectors, intelligent control systems, robot arms and sorting mechanisms, a smart conveyor system and a real-time updatable inventory database with application software. This study demonstrates the significance and benefits of a smart conveyor system with the integration of RFID technology for product identification and handling, specifically in the manufacturing industry.
- item (Conference-Full-text): Representation of web based graphics and equations for the visually impaired (2013) Gunawardhana, CLR; Hasanthika, HHM; Piyasena, TDG; Pathirana, SPDP; Fernando, MSD; Perera, AS; Kohomban, U. With its extensive growth, technology is becoming prominent in making learning more interactive and effective. Due to the use of Internet-based resources in the learning process, the visually impaired community faces difficulties. In this research we focus on developing an e-Learning solution that is accessible to both sighted and visually impaired users. Access to tactile graphics is an important requirement for visually impaired people. The recurrent expenditure of printers which support graphic printing, such as thermal embossers, is beyond the budget of most developing countries, which cannot afford such a cost for printing images. Currently, most books printed using normal text Braille printers ignore images in documents and convert only the textual part. Printing images and equations using normal text Braille printers is a main research area of the project. Mathematical content in a forum and simple images such as maps in a course page need to be made affordable using the normal text Braille printer, as these functionalities are not available in current Braille converters. The authors came up with an effective solution to the above problems, and the solution is presented in this paper.
- item (Conference-Full-text): Real time personalized aggregated activity stream for an organization (2013) Kajaruban, S; Madarasinghe, BR; Withanage, SP. In a large organization there is a bulk pool of information aggregated from internal and external sources. It is often very hard to keep track of every piece of information that is published in different systems. This paper introduces a novel system which automatically personalizes Activity Streams after aggregating and consolidating them from different independent systems (e.g., in a software company context: version management systems, wikis, bug trackers, etc.). It retrieves information by sensing activities in these independent systems; the information is then processed and transformed into a news feed in a specific standardized format. After generating the news feed, the system personalizes the feed using employee information. Finally, it presents these personalized Activity Streams in a user interface, helping users find relevant information easily without missing any data or losing valuable time. It also suggests further information based on the employee's past history. This system addresses the problem of 'information silos' by reducing and managing information in a structured way.
- item (Conference-Full-text): PC Based open standard radar display system (2013) Attanayake, WMPP; Deshapriya, KKVVC; Dissanayake, KDMK; Ekanayake, EMKUB. This paper describes the implementation of a new PC-based radar monitoring system which eliminates some limitations of the current monitoring system and introduces some advanced features over it. The project covers the implementation of two main functional units: an open standard interface unit and a web-based monitoring unit. The open standard interface enables interfacing different radar data protocols to a normal PC through the USB interface; this data can be used to develop any application that processes radar data, and in this particular project we have developed a display application to monitor traffic. Providing a web interface to monitor the radar data enables the user to access the system remotely. We have also added features such as traffic statistics recording and identifying various issues in radar units, such as lost targets and patches, with the purpose of enhancing the functionality of the display system.
- item (Conference-Full-text): Optimization methodologies for building performance modeling and optimization (2013) Bandara, RMPS; Attalage, RA. Buildings account for approximately 40% of global energy consumption and 36% of total carbon dioxide emissions. At present, high emphasis is placed on reducing energy consumption and carbon footprint by optimizing the performance and resource utilization of buildings to achieve sustainable development. Building performance is analyzed in terms of energy performance, indoor environment for human comfort and health, environmental degradation and economic aspects. Energy performance can best be modeled and optimized by a whole-building energy simulation tool coupled with an appropriate optimization algorithm. Building performance optimization problems are inherently multivariate and multi-criteria. Optimization methodologies with different characteristics, broadly classified as Adaptive, Non-adaptive and Pareto algorithms, can be applied in this regard. The paper discusses the applicability of the aforementioned optimization methodologies in building performance optimization for achieving realistic results.
- item (Conference-Full-text): Intelligent lighting controller for domestic and office environments (2013) Wijesuriya, WASI; Perera, AHP; Gayan, JMU; Attalage, RA; Dassanayake, VPC. The use of natural light instead of artificial light to conduct activities has been shown to have positive physical and psychological effects on humans. Thus the growing trend of incorporating natural light in office spaces and households has created a need for control between the sources of natural and artificial light. Autonomously providing adequate natural light when it is present, and compensating when the light level does not meet the required level, is the primary task of such controllers. Furthermore, saving energy by operating intelligently according to user presence and demand is the other aspect such controllers strive to achieve. The aim of the project is to develop a system which addresses aspects of controlling both natural and artificial light inside a room efficiently while being cost-effective to install. The project aims to develop a system adapted to conditions found in Sri Lanka and to research the light-level preferences of defined groups of people (consisting of Sri Lankans), after which a mathematical model is developed to achieve the aforementioned criteria of balancing light sources for a user.
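The controller's core rule, as the abstract describes it, is to top up daylight with just enough artificial light to reach a required level, and to switch off when nobody is present. A minimal sketch of that rule follows; the 500 lux setpoint, the lamp's maximum output and the linear dimming model are hypothetical assumptions, not the paper's measured preferences.

```python
def lamp_level(natural_lux, occupied, target_lux=500, lamp_max_lux=600):
    """Return the lamp dimming fraction (0.0-1.0) needed so that daylight
    plus artificial light meets the target illuminance."""
    if not occupied:
        return 0.0                                # save energy: room is empty
    deficit = max(0.0, target_lux - natural_lux)  # light the daylight can't supply
    return min(1.0, deficit / lamp_max_lux)       # linear dimming model (assumed)

print(lamp_level(200, True))   # 0.5 -> lamp supplies the missing 300 lux
print(lamp_level(800, True))   # 0.0 -> daylight alone meets the target
print(lamp_level(100, False))  # 0.0 -> unoccupied, lights off
```

In practice the setpoint would come from the paper's surveyed user preferences and the dimming curve from the installed luminaire, but the balancing logic has this shape.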
- item (Conference-Full-text): Improving geospatial data discovery by enhancing public metadata catalog search services (2013) Careem, M; Karunarathne, D. Most public geospatial data is served via standards-compliant Web Mapping Servers and Web Feature Servers available globally. However, it is quite complicated to search for and locate the relevant data if the exact data source name and other parameters are unknown. Catalog servers, which store metadata and provide search protocols, offer a standard way to handle this problem, but only a few catalog servers exist, which limits their usefulness for ad hoc geospatial data discovery. This paper looks at the advantages of making metadata of geospatial data available through catalog servers for ad hoc geospatial data discovery. It examines the limitations of searching geospatial data using Web Mapping Servers, Web Feature Servers and search engines such as Google, and presents a case for having more catalog servers. It then looks at a novel way of building metadata from existing Web Mapping Servers using common database techniques, which could lead to more metadata in catalog servers and in turn to better and more efficient geospatial data discovery. It then looks at a specialized client that is used to search catalog servers over the web, providing domain experts with a powerful tool to accurately search for data. 1. Introduction: The rapid increase of public spatial data on the Internet, coupled with the need for using spatial data in meaningful ways, is shifting the focus towards the discovery and integration of data. During the past decade there has been tremendous growth in the number of spatial data sources available as public web services, and these sources are being used in domains as diverse as finance, education, citizen services, digital media and emergency response, to name a few.
The majority of this data is made available to the general public as raster maps on Web Mapping Service (WMS) servers [1] and vector data on Web Feature Service (WFS) servers [2]. The Geospatial Information Database (GIDB) [3] project lists 1400 WMS servers serving over 330,000 layers, indicating the scale of data that needs to be searched to identify relevant data. With such large amounts of data available for public use, the task is left to the end user to search, identify and fetch the relevant spatial data he/she is looking for. The ability to discover and access geographical data for planning, visualization and decision making is a requirement to support communities and activities at local, regional, national and international levels. This is summarized well in a statement from the Spatial Data Infrastructure [4] cookbook in its chapter on geospatial
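The metadata-building step the paper proposes can be pictured as harvesting a WMS server's GetCapabilities response into per-layer records that a catalog server could index. A simplified sketch follows, assuming a stripped-down capabilities document; real WMS capabilities responses are namespaced and carry far more fields (bounding boxes, CRS lists, abstracts) than this hypothetical example.

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified GetCapabilities response.
CAPS = """<WMS_Capabilities>
  <Capability><Layer>
    <Layer><Name>rivers</Name><Title>Major Rivers</Title></Layer>
    <Layer><Name>roads</Name><Title>Road Network</Title></Layer>
  </Layer></Capability>
</WMS_Capabilities>"""

def harvest(capabilities_xml, server_url):
    """Extract per-layer metadata records suitable for a catalog index."""
    root = ET.fromstring(capabilities_xml)
    records = []
    for layer in root.iter("Layer"):
        name = layer.find("Name")
        if name is not None:            # skip the unnamed grouping layer
            records.append({
                "server": server_url,
                "name": name.text,
                "title": layer.find("Title").text,
            })
    return records

for rec in harvest(CAPS, "http://example.org/wms"):
    print(rec["name"], "-", rec["title"])
```

Loading such records into a searchable store is the "common database techniques" half of the idea: once every public WMS layer has a metadata row, ad hoc discovery becomes an ordinary catalog query instead of guessing server URLs.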
- item (Conference-Full-text): Generic log file data extraction (2013) Bandara, TPSH; Chandrasekara, WKMSP; Chathunga, JAR; Chiranjeewa, KAL; Wimalasuriya, DC; Fernando, MBTL; Jayathilake, PWDC. Automated software log file analysis holds an important position in software maintenance. Currently available analysis tools are not generic: they tend to focus on specific software or servers, and their flexibility is minimal. Furthermore, the costs of commercially available log analysis tools are not affordable for small and medium scale firms. This has left a void in the market for generic, customizable and open source log file analysis tools. The impediment to such a tool emerging is the unavailability of a generic log file data extraction mechanism. A generic log file format definition language and an underlying persistent data storage system are a solution to this problem. Log file structures can be defined by the aforementioned language, and the extracted data is stored in the persistent storage. This methodology enables generic log file analysis on top of the extracted data. Through the research and implementations carried out, it was identified that a modified version of a simple declarative language is suitable as the log file format definition language. It would have the capability of handling and defining all patterns of text-based log files. Additionally, the results revealed that the appropriate storage mechanism is an Extensible Markup Language (XML) database, mainly because of the similarities between the hierarchical nature of XML and common log file structures.
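The pipeline the abstract outlines has two halves: a declarative format definition drives a generic extractor, and the parsed records land in hierarchical XML storage for later analysis. A minimal sketch follows; the named-group regex here is an illustrative stand-in for the paper's actual definition language, and the sample log line is invented.

```python
import re
import xml.etree.ElementTree as ET

# Declarative definition of one text-based log format (hypothetical example).
APACHE_LIKE = re.compile(
    r"(?P<host>\S+) - - \[(?P<time>[^\]]+)\] \"(?P<request>[^\"]+)\" "
    r"(?P<status>\d{3})"
)

def extract(lines, pattern):
    """Apply a format definition to raw log lines; yield field dicts."""
    for line in lines:
        m = pattern.match(line)
        if m:
            yield m.groupdict()

def to_xml(records):
    """Persist extracted records as an XML tree: one <entry> per log line,
    one child element per extracted field."""
    root = ET.Element("log")
    for rec in records:
        entry = ET.SubElement(root, "entry")
        for field, value in rec.items():
            ET.SubElement(entry, field).text = value
    return root

lines = ['127.0.0.1 - - [10/Oct/2013:13:55:36] "GET /index.html HTTP/1.1" 200']
tree = to_xml(extract(lines, APACHE_LIKE))
print(tree.find("entry/status").text)  # 200
```

Swapping in a different format definition changes nothing downstream, which is the genericity the paper is after; an XML database then queries the stored tree directly.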
- item (Conference-Full-text): Forevidizer: forensic video & image analyzing toolkit (2013) Wijayasiri, A; Sampath, C; Rathnayaka, N; Jayaweera, R; De Silva, C. Digital videos and images have become a common part of life, and increasingly sophisticated tools are becoming available to general consumers. With the advancement of digital image processing and video processing technologies, various kinds of images and videos are produced from different perspectives. As a result, videos can be used for various frauds and illegal activities. Legislative changes have been made to accept videos and images from digital cameras as evidence in legal proceedings. Consequently there is a growing interest in forensic analysis of video content, where the integrity of digital images and videos needs to be checked. In this respect it has become essential to have a proper toolkit to analyze whether a particular video is genuine or has been tampered with. As video editing techniques become very sophisticated, tampered videos are hard to detect. However, when a video is tampered with, some of its basic properties change, and detecting those changes requires complex image processing and video processing techniques and algorithms. We present methods to analyze these properties of a given video and produce statistical details to ascertain whether it has been tampered with, and if so, what changes have been made. Video frame duplicate detection, video double MPEG compression detection, image double JPEG compression detection and detection of duplicated regions within images are the basic methods of forgery detection.
- item (Conference-Full-text): Flight search optimization using in-memory data management (2013) Dimithrie, PS; Dias, G. In current travel planning systems, even for a relatively straightforward round-trip query, it is not uncommon to spend more than 30 seconds. Producing results as quickly as possible yields a competitive advantage in the market, since the delay is undesirable for the user of the system and reduces interactivity. Traditionally, data is placed in storage and then, when needed, is accessed and acted upon in the computer's memory, which creates a natural bottleneck that reduces speed. With the emergence of multi-core processors and the availability of large amounts of main memory at low cost, new breakthroughs in the software industry, such as in-memory technology, are being created. In-memory and multicore technology have the potential to improve performance: if all data can be stored in main memory instead of on disk, the performance of operations on data, especially on mass data, is improved. In this paper we take advantage of in-memory technology, where all data resides and is processed in main memory, and develop a CPU-based algorithm to optimize flight and air fare search in air travel planning, primarily using a hashing technique. This algorithm also has the potential to take advantage of multi-core processors in the future, since it uses in-memory data management. With the use of Google hash maps, memory has been used effectively. With the selected sample data, almost all searches could be performed in milliseconds. As the maximum number of connecting airports increases, the search time also increases.
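The hashing idea in the abstract, indexing direct flights by origin airport in an in-memory hash map so that each connection-expansion step is a cheap lookup, can be sketched as follows. The routes, fares, airport codes and two-leg limit are invented sample data for illustration, not the paper's dataset or its exact algorithm.

```python
from collections import defaultdict

# Invented sample data: (origin, destination, fare).
flights = [
    ("CMB", "SIN", 180), ("CMB", "DXB", 320),
    ("SIN", "NRT", 410), ("DXB", "NRT", 390),
]

# In-memory hash map: origin -> list of (destination, fare).
# Expanding connections from an airport is then a single O(1) lookup.
by_origin = defaultdict(list)
for org, dst, fare in flights:
    by_origin[org].append((dst, fare))

def cheapest(origin, dest, max_legs=2):
    """Expand itineraries breadth-first up to max_legs; return the lowest
    total fare found as (fare, route), or None if unreachable."""
    frontier = [(origin, 0, [origin])]
    best = None
    for _ in range(max_legs):
        nxt = []
        for airport, cost, path in frontier:
            for d, f in by_origin[airport]:
                if d in path:                       # skip cycles
                    continue
                if d == dest:
                    route = (cost + f, path + [d])
                    if best is None or route < best:
                        best = route
                else:
                    nxt.append((d, cost + f, path + [d]))
        frontier = nxt
    return best

print(cheapest("CMB", "NRT"))  # (590, ['CMB', 'SIN', 'NRT'])
```

Because the whole index lives in main memory, each query touches no disk at all, which is how sub-second (here, sub-millisecond) search on small samples becomes plausible; the breadth-first frontier also explains why allowing more connecting airports grows the search time.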