# Presentations

## Undergraduates
Author | Supervisor | Title |
---|---|---|
Edward DeSaulniers | A. El-Rabbany | Is GPS a Good Business Decision for Small Land Surveying Companies? |
Jamie Wheaton | A. El-Rabbany | The Integration of VOCARTA in the New Brunswick Department of Transportation |
Peter Yorke | D. Wells | The Standardization of Data for Updating Hydrographic Charts |
## Graduates
Author | Supervisor(s) | Title & Abstract |
---|---|---|
Rima Ammouri | D. Coleman, J. McLaughlin | Impact of Communication and Information Technologies on Public Participation Tools for Land Management: The New Brunswick Land Gazette This paper presents a theoretical framework describing the alternative impacts of communication and information technology on public participation methods and techniques. As Geographic Information Systems (GIS) are introduced into public participation settings in increasing numbers, a taxonomy of ongoing research into "Public Participation GIS" (PPGIS) and its related areas will be presented. The author will then discuss the impact of information technology (IT) on land information system development in New Brunswick and its modernization from a land data bank to an integrated, Internet-based land information system (the Land Gazette). The vision of the Land Gazette is to develop new communication tools between government and the public in order to provide electronic public notice about land and land use, together with an information and feedback mechanism that improves public participation in the land management process. After evaluating the potential impact of the Land Gazette, the author will discuss future improvements and trends aimed at achieving better public participation in land management. |
Sunil Bisnath | R. Langley | Precision Assessment of GPS-Based Geometric Tracking of Low Earth Orbiters In recent years the Global Positioning System (GPS) has been utilised for precise, a posteriori low earth orbiter (LEO) tracking. This is accomplished by placing a GPS receiver aboard the spacecraft and optimally combining high-fidelity models of the LEO orbital dynamics with computed relative positions between the LEO GPS antenna and stationary terrestrial antennas of known position. This is a complex and onerous task. A data processing method - the "GPS-based geometric tracking strategy" - is proposed which consists of a kinematic sequential least squares filter/smoother. This strategy requires only GPS measurements and freely available related information. It is therefore computationally more efficient than the conventional tracking approach. To test the performance of this strategy, the filter/smoother has been realised in computer code and applied to simulated data to assess the precision of the LEO position estimates. Results indicate that sub-decimetre precision of the LEO position in each Cartesian component is achievable with only tens of minutes of processing time. The position noise attributable to each error source was also determined and shows that the greatest error source is the GPS receiver measurement noise. |
James Clarke | D. Wells | A Review of the Digital Nautical Chart The introduction of electronic charts and their display systems has greatly improved the mariner's ability to absorb information and assess developing navigation and collision avoidance situations. In order to ensure that the mariner is receiving data that is "paper equivalent", a number of standards and associated product specifications have been developed. One such product is the Digital Nautical Chart (DNC), an electronic chart that has been developed to fulfil the maritime navigation requirements of NATO nations. The objective of this paper is to examine DNC documentation and present enough relevant information to provide the non-DIGEST expert with an understanding of the product. The report examines the DNC/DIGEST relationship, the VPF/DNC data model, as well as the implementation of the standard. In the final sections of the report, the DNC update problem is explained and an end-user solution using a corrections layer is recommended. |
Tianhang Hou | L. Mayer | Noise and Artifact Reduction The wavelet transform is a new tool for signal and image processing. It is similar to the short-time Fourier transform and can be used for decomposition of non-stationary signals into time-frequency atoms. Wavelets enable a signal to be analyzed at different frequencies with different resolutions. A wavelet transform possesses several properties such as accurate local description, perfect reconstruction, and flexibility of basis function selection. These are useful in applying and analyzing diverse types of signals. The localizing or concentrating properties of the wavelet transform make it particularly powerful in signal and image compression, de-noising, and feature detection. In this paper, the basic concepts, properties, and computing algorithms of wavelets are introduced and compared to traditional signal processing methods. As an example of a wavelet application, the de-noising (removing the random noise in a signal and the artifacts in an image) of ship-track artifacts in multibeam sidescan imagery is presented. This is almost impossible to achieve using traditional signal processing methods. |
Jianliang Huang | P. Vaníček | Effects of Topographical Mass Density Variation on Gravity and Geoid in the Canadian Rocky Mountains Gravity reduction from the earth's surface to the geoid requires knowledge of topographical mass density. However, in practice the mean density (2.67 g/cm³) is mostly used to approximate the actual density because of the difficulty and complexity of obtaining the actual density. This approximation introduces errors in the reduced gravity and, consequently, in the geoid. Geographic Information Systems (GIS) offer the possibility of introducing the actual density data by overlaying bedrock density and digital geological maps. As a part of the effort towards the construction of the 'one centimetre geoid' for Canada, the effects of lateral topographical density variation on gravity and geoid were investigated in the area of the Canadian Rocky Mountains. Density values were estimated from the geological maps of Canada and the United States, and bedrock density tables were compiled for use in the ArcView GIS. The 5' by 5' mean and point effects were computed from height and density data on a 30'' by 60'' grid. The mean direct (topographical) density effect (DDE) on gravity ranges between -7.9 and 3.8 mGal (mean of 0.001 mGal) at the earth's surface, and from -22.3 to 13.8 mGal (mean of 0.002 mGal) at the geoid. The secondary indirect (topographical) density effect (SIDE) on gravity varies between -11 and 7 µGal. The primary indirect (topographical) density effect (PIDE) on the geoid changes from -6.7 to 4.3 cm (mean of 0.5 cm). The total topographical density effect on the geoid ranges between -12.9 and 6.4 cm (mean of 0.8 cm). Our results suggest that the effect of lateral topographical density variations is significant and ought to be taken into account for the determination of the one centimetre geoid. |
Edouard Kammerer | J. Hughes Clarke | The Influence of Sound Speed Variations at the Face of Multibeam Sonar Transducers: the Effect of Array Geometry and Array Orientation The purpose of this paper is to present a study of the shape of the distortions generated in multibeam sounding solutions due to changes in the surface sound speed in the ocean. In order to make the distortions as visible as possible, a hypothetical flat seafloor model has been chosen. Different sonar array geometries are considered: curved array sonars, linear arrays using FFT beamformers, motion-stabilised sonars and dual transducer array sonars. The shape of the artifact depends on two major characteristics of the beamforming method used in these systems: (1) the steering process of a beam makes assumptions about the wavelength of sound, and changes in this cause deflection of the beam from its original direction; (2) roll stabilisation by the sonar keeps the swath symmetrical with respect to the vertical irrespective of the roll; however, the refraction effect due to steering follows the roll. This study permits one to understand and to model refraction artifacts in order to facilitate their removal. |
Jane Law | D. Coleman, J. McLaughlin, D. Willms | The Analysis of Spatial Clusters With the increasing availability of large spatial databases, new tools are needed for users to analyze, explore, summarize and display huge sets of spatial data. One important issue to address when dealing with large spatial databases is the aggregation or exploration of similar and neighboring observations in the analysis of spatial clusters. This paper introduces an aggregation index, L, that can be used to analyze spatial clusters formed using one variable. The index was originally derived in research aimed at aggregating contiguous areas to form clusters of similar characteristics. It was later found to be very useful in the general study of spatial association (identifying clusters of similar and dissimilar values), as performed by other indicators such as Moran's I, the Moran scatterplot and the Gi* statistic. This paper looks into this particular application of the index in identifying clusters of similar and dissimilar values. It shows the derivation of the index, its properties and its advantages. The index measures the homogeneity within a cluster and the heterogeneity between the cluster and a reference (e.g., the grand mean). Homogeneity is the sum of squared differences between the observations within the cluster and their mean. Heterogeneity is the sum of squared differences, for each observation, between the mean of observations in the cluster and the reference. The aggregation index, L, for a cluster is the ratio of its heterogeneity over its total (the sum of homogeneity and heterogeneity, as defined above). It measures both homogeneity and heterogeneity of a cluster by one number that falls between zero (when the mean of observations of the cluster equals the reference) and one (when all observations within the cluster are the same). Comparisons are made between the aggregation index, L, and the local Moran's I, local G statistics and Moran scatterplot used to explore the different natures of local spatial clusters. |
W. Kyle Little | D. Coleman | Integration of Computer Simulations with Military Command and Control Military organizations have employed automated command and control systems and also computer based simulations for a number of years. These two types of system share many common characteristics; one of which is the processing of large amounts of spatially referenced data. However, these systems are typically developed and used in isolation from each other. One area of current research is directed towards integrating these systems in order to provide more effective training for military commanders at all levels. This report identifies the need for integration of command and control systems with simulations. The possible approaches to implement this integration are examined in terms of their advantages and disadvantages. Examples of current research and development in the United States, the United Kingdom and Canada are reviewed. Finally, a project to integrate computer based simulations with automated command and control systems currently in service with the Canadian Army will be introduced. |
José-Vicente Martinez | L. Mayer | Multibeam Sonar Applications: Mapping Benthic Habitat In the last decade there has been a significant increase in the utilization of shallow-water multibeam sonar systems (MBSS). The reasons for this are the technological advances in sonar design, computer performance, and ancillary instrumentation. The detailed seafloor data products created with MBSS (bathymetric DEMs and acoustic backscatter imagery) are the subject of analysis by various scientific disciplines, in particular fisheries research and marine ecology. Full-bottom coverage bathymetry and acoustic backscatter provide a description of the spatial distribution of seafloor relief, composition, and texture. Depth, relief and bottom types are important physical variables governing the distribution of benthic biological resources. The application and analysis of multibeam sonar data is proposed for the characterization of benthic habitats and biotopes. GIS spatial analysis tools are used to classify the seafloor surface based on depth, slope, and backscatter strength as separate mapping units. Combinations of mapping units through Boolean operations are used to create maps estimating the distribution of physical habitats. Alternatively, backscatter classes could be validated into seafloor types by correlating them against sediment properties from sampling stations. Finally, the maps of physical habitats are used in bivariate and multivariate analysis along with benthic biological data. The ultimate goal is the generation of spatial distribution maps of marine habitats and biotopes based on the environmental variables determined with MBSS. The study is conducted in the coastal margins of Santa Monica Bay, California, where multibeam sonar surveys have been conducted by the U.S. Geological Survey in collaboration with the Ocean Mapping Group. |
Kevin Pegler | D. Coleman | TIN Random Densification for the Removal of Ridging Effects from Digital Terrain Model Data Files The province of New Brunswick began a systematic program of province-wide Digital Terrain Model (DTM) coverage in the late 1980s. Using 1:35,000-scale aerial photography, the DTMs were collected photogrammetrically as a series of profiles spaced 70 metres apart. No regard was given to breaklines along roads or water bodies. The DTMs have gone through a series of stringent quality control checks to ensure that the elevations of data points are blunder-free and all fall within specified accuracy tolerances. However, users of these DTMs have continued to express concern over perceived data quality based on the evidence of a regular "ridging" effect within many of the DTM files when viewed under certain conditions. Aware of similar phenomena found in DTMs produced by other organizations, Service New Brunswick (SNB) commissioned researchers in the Department of Geodesy and Geomatics Engineering at the University of New Brunswick to investigate the respective requirements and potential costs of existing alternatives for batch processing of the DTMs to remove this ridging effect. This paper presents the results of research examining a specific compensation approach: TIN Random Densification. This process is employed in the removal of the ridging phenomena found in SNB's Enhanced Topographic Database DTM data. Further, the factors that cause ridging will be discussed from both "data limitation" and "structuring and visualization" perspectives. Finally, the respective strengths and weaknesses of the TIN Random Densification process will be presented. |
Kutalmis Saylam | D. Coleman, S. Nichols | Data Integration within the Canadian Geospatial Data Infrastructure An extensive amount of work has been done by the Inter-Agency Committee on Geomatics (IACG) and the Canadian Council on Geomatics (CCOG) concerning the Canadian Geospatial Data Infrastructure (CGDI) concept since 1996. IACG and CCOG have been organizing workshops and meetings at the provincial, federal and national levels. A number of issues have been identified as critical components of the nation's need for a geospatial data infrastructure. In 1997, the GeoConnections Secretariat was formed to coordinate the activities and facilitate communication at the organizational and user levels. CGDI, as a concept, has five main thrusts. The Framework Data component is critical for accessing, retrieving, maintaining, and geo-referencing the data. Through the framework, geospatial data should be integrated (e.g., horizontally and vertically) to ensure national consistency, to support application development, and to enable more specific detailed data collection. Furthermore, specific tools and services are needed to complement data discovery and access (e.g., visualization and generalization). This seminar paper gives a very brief explanation of the CGDI concept and the five thrusts, and explores the data integration issues as seen from the framework data component of the CGDI. The paper identifies the issues and challenges faced in aligning and integrating data throughout the nation, at both the federal and provincial levels. |
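The aggregation index L in the Law abstract above is fully defined in prose, so it can be sketched directly in code. This is a minimal illustration reconstructed from that prose definition only; the function name, example values and reference are invented for demonstration and are not from the paper:

```python
import numpy as np

def aggregation_index(values, reference):
    """Aggregation index L for a one-variable cluster.

    Homogeneity: sum of squared differences between the observations
    and the cluster mean. Heterogeneity: sum of squared differences,
    one term per observation, between the cluster mean and a reference
    (e.g., the grand mean). L = heterogeneity / (homogeneity + heterogeneity).
    """
    values = np.asarray(values, dtype=float)
    mean = values.mean()
    homogeneity = np.sum((values - mean) ** 2)
    heterogeneity = len(values) * (mean - reference) ** 2
    total = homogeneity + heterogeneity
    if total == 0.0:  # all observations equal the reference
        return 0.0
    return heterogeneity / total

# A tight cluster far from the reference scores near 1:
print(aggregation_index([10.0, 10.1, 9.9], reference=0.0))
# A cluster whose mean equals the reference scores 0:
print(aggregation_index([1.0, -1.0], reference=0.0))  # → 0.0
```

As the abstract states, L falls between zero (cluster mean equals the reference) and one (all observations within the cluster identical), so a single number captures both within-cluster homogeneity and cluster-to-reference heterogeneity.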
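The Boolean combination of mapping units described in the Martinez abstract above can be sketched with raster-style arrays. The grids and threshold values below are invented toy examples, not figures from the Santa Monica Bay study; the point is only to show how classified depth, slope, and backscatter layers combine into a candidate habitat map:

```python
import numpy as np

# Toy 2x2 grids standing in for co-registered raster surfaces.
depth = np.array([[12.0, 45.0], [80.0, 30.0]])             # metres
slope = np.array([[2.0, 15.0], [1.0, 4.0]])                # degrees
backscatter = np.array([[-18.0, -30.0], [-25.0, -12.0]])   # dB

# Classify each surface into a mapping unit (thresholds hypothetical).
shallow = depth < 50.0
low_relief = slope < 5.0
hard_bottom = backscatter > -20.0  # stronger returns ~ coarser bottom

# Boolean combination of mapping units -> candidate physical habitat.
candidate_habitat = shallow & low_relief & hard_bottom
print(candidate_habitat)
# → [[ True False]
#    [False  True]]
```

Each cell of `candidate_habitat` is True only where all three conditions hold, which is the raster analogue of the GIS overlay operations the abstract describes.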