Browsing by Author "Rahman, Mohammad Rashedur"
Now showing 1 - 3 of 3
Item (Open Access)
A Dynamic Replica Placement Strategy in Grid Environment (2006-02-14) Rahman, Mohammad Rashedur

Grid computing emerges in part from the need to integrate a collection of distributed computing resources to offer performance unattainable by any single machine. Grid technology facilitates data sharing across many organizations in different geographical locations. Data replication is an excellent technique to move and cache data close to users. Replication reduces access latency and bandwidth consumption; it also facilitates load balancing and improves reliability by creating multiple data copies. One of the challenges in data replication is selecting the candidate sites where replicas should be placed, which is known as the allocation problem. One performance metric for determining the best places to host replicas is the average (or aggregated) response time. We use the p-median model for the replica placement problem. The p-median model has been used in urban planning to find locations where new facilities should be built. In our problem, the p-median model finds the p candidate sites for hosting replicas that minimize the aggregated response time. A Grid environment is highly dynamic, so user requests and network latency vary constantly. Therefore, the candidate sites currently holding replicas may not be the best sites from which to fetch replicas on subsequent requests. We propose a dynamic replica maintenance algorithm that re-allocates replicas to new candidate sites if the performance metric degrades significantly over the last K time periods. Simulation results demonstrate that the dynamic maintenance algorithm combined with static placement decisions performs best in dynamic environments like Data Grids.
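To illustrate the placement idea summarized in the abstract above, the following is a minimal Python sketch of a p-median style placement (exhaustive search over candidate replica sets) together with a simple degradation check over the last K periods. The function names, data structures, and threshold are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
from itertools import combinations

def p_median_placement(sites, request_rate, response_time, p):
    """Choose p replica sites that minimize the aggregated response time.

    sites: list of site ids; request_rate[s]: requests issued by site s;
    response_time[(s, r)]: expected response time when s fetches from r.
    Exhaustive search, so only suitable for small grids (illustration only).
    """
    best_choice, best_cost = None, float("inf")
    for candidate in combinations(sites, p):
        # Each requesting site is served by its cheapest replica holder.
        cost = sum(request_rate[s] * min(response_time[(s, r)] for r in candidate)
                   for s in sites)
        if cost < best_cost:
            best_choice, best_cost = candidate, cost
    return best_choice, best_cost

def needs_relocation(observed_costs, baseline_cost, k=5, tolerance=1.2):
    """Dynamic maintenance trigger (assumed form): relocate if the aggregated
    response time stayed above tolerance * baseline in each of the last k periods."""
    window = observed_costs[-k:]
    return len(window) == k and all(c > tolerance * baseline_cost for c in window)

# Tiny usage example with made-up numbers:
sites = ["A", "B", "C"]
rate = {"A": 10, "B": 4, "C": 1}
rt = {(s, r): (0 if s == r else 5) for s in sites for r in sites}
print(p_median_placement(sites, rate, rt, p=1))  # -> (('A',), 25)
```

The exhaustive search is only there to make the objective explicit; for larger grids a heuristic or approximation would be used instead, and the relocation rule above is just one plausible way to react to sustained degradation.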
Item (Open Access)
K-nearest Neighbor Rule: A Replica Selection Approach in Grid Environment (2006-02-14) Rahman, Mohammad Rashedur

Grid technology is developed to share data across many organizations in different geographical locations. Data replication is a good technique that moves and caches data closer to users. The idea of replication is to store copies in different locations so that data can easily be recovered if one copy is lost. Moreover, if data are kept closer to users via replication, data access performance can improve dramatically. When different sites hold replicas, selecting the best replica yields significant benefits. Network performance plays a major role in replica selection; however, current research shows that other factors, such as disk I/O, also play an important role in file transfer. In this paper, we describe a new optimization technique that considers both disk throughput and network latency when selecting the best replica. The previous history of data transfers can help predict the best site from which to fetch a replica. The k-nearest neighbor rule is one such predictive technique: when a new request for the best replica arrives, it searches all previous data for a subset of similar file requests and uses them to predict the best site holding the replica. In this work, we implement and test the k-nearest neighbor algorithm for various file access patterns and compare the results with the traditional replica-catalog-based model. The results demonstrate that our model outperforms the traditional model for sequential and unitary random file access requests.

Item (Open Access)
Replica placement and selection strategies in data grids (2007) Rahman, Mohammad Rashedur; Barker, Kenneth E.
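As a rough illustration of the k-nearest neighbor selection idea in the second abstract above, the sketch below predicts the best replica site for a new request from a history of past transfers. The feature names (file size, network latency, disk throughput), the Euclidean distance, and the majority vote are assumptions made for this example, not the paper's actual feature set or classifier configuration.

```python
import math
from collections import Counter

def knn_best_site(history, request, k=3):
    """history: list of (features, best_site) pairs from past transfers, where
    features is a dict, e.g. {"file_size_mb": 512, "network_latency_ms": 40,
    "disk_throughput_mbps": 80}; request: feature dict for the new file request.
    Returns the site favoured by the k most similar past requests."""
    def distance(a, b):
        # Euclidean distance over the request's features; in practice the
        # features would be normalized so no single scale dominates.
        return math.sqrt(sum((a[f] - b[f]) ** 2 for f in request))

    nearest = sorted(history, key=lambda rec: distance(rec[0], request))[:k]
    votes = Counter(site for _, site in nearest)  # majority vote among neighbours
    return votes.most_common(1)[0][0]

# Tiny usage example with made-up history:
history = [
    ({"file_size_mb": 500, "network_latency_ms": 30, "disk_throughput_mbps": 90}, "site1"),
    ({"file_size_mb": 520, "network_latency_ms": 35, "disk_throughput_mbps": 85}, "site1"),
    ({"file_size_mb": 20,  "network_latency_ms": 120, "disk_throughput_mbps": 40}, "site2"),
]
print(knn_best_site(history, {"file_size_mb": 510, "network_latency_ms": 32,
                              "disk_throughput_mbps": 88}, k=2))  # -> "site1"
```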