Category Archives: Algorithms

New article on sinkhole detection published in Geomorphology

My new peer-reviewed article titled “Automated delineation of karst sinkholes from LiDAR-derived digital elevation models” has been published in the latest issue of Geomorphology. You can download a free online copy using this link (expires on July 8, 2016). In this paper, we present a localized contour tree method for the automated extraction of sinkholes in karst landscapes. The study area was Fillmore County in southeastern Minnesota, USA. See some figures below:


Fig. 1. Distribution of sinkhole inventory points in Fillmore County, Minnesota, USA.


Fig. 3. Contour representation of a compound surface depression. (a) Contours overlain on DEM shaded relief. (b) Elevation profile of the transect A–B shown in (a).


Fig. 8. LiDAR DEM shaded relief (a) and examples of extracted sinkhole boundaries overlain on LiDAR DEM shaded relief (b) and color infrared aerial imagery (c).


Fig. 9. Sinkhole boundaries delineated using different methods. (a) The sink-filling method. (b) The localized contour tree method.
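For readers curious how contour-based depression extraction works in principle, here is a simplified, hypothetical sketch (not the paper's implementation; the function name and all parameters are illustrative). It slices a gridded DEM at successive elevation levels, from high to low; a connected patch of cells below a level that does not touch the grid edge corresponds to a closed contour, i.e. a candidate sinkhole:

```python
from collections import deque

def find_depressions(dem, interval=1.0, min_depth=1.0):
    """Slice the DEM at elevation levels (high to low). A connected patch of
    cells below a level that stays off the grid edge is a closed contour;
    the outermost such contour is recorded as a candidate depression."""
    rows, cols = len(dem), len(dem[0])
    zmin = min(min(row) for row in dem)
    zmax = max(max(row) for row in dem)
    levels = []
    z = zmin + interval
    while z <= zmax:
        levels.append(z)
        z += interval
    claimed = set()   # cells already assigned to a depression
    found = []
    for level in reversed(levels):
        seen = set()
        for r in range(rows):
            for c in range(cols):
                if (r, c) in seen or dem[r][c] >= level:
                    continue
                # BFS flood fill of the connected patch below this level
                patch, touches_edge = [], False
                queue = deque([(r, c)])
                seen.add((r, c))
                while queue:
                    i, j = queue.popleft()
                    patch.append((i, j))
                    if i in (0, rows - 1) or j in (0, cols - 1):
                        touches_edge = True
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and (ni, nj) not in seen and dem[ni][nj] < level):
                            seen.add((ni, nj))
                            queue.append((ni, nj))
                # skip open contours and patches nested in one already found
                if touches_edge or any(p in claimed for p in patch):
                    continue
                depth = level - min(dem[i][j] for i, j in patch)
                if depth >= min_depth:
                    claimed.update(patch)
                    found.append({"depth": depth, "cells": len(patch)})
    return found

# Demo: a flat 20 x 20 surface at 10 m with two single-cell pits
dem = [[10.0] * 20 for _ in range(20)]
dem[5][5] = 6.0
dem[14][14] = 7.0
pits = find_depressions(dem)
```

A real implementation would of course operate on actual contour polygons and decompose compound depressions via the contour tree; this sketch only illustrates the level-slicing and closed-contour test.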

R script for updating student grades on Blackboard

This might be of interest to those of you teaching large-enrollment courses and using scantrons for quizzes/exams. I developed a script in the R programming language to automatically extract scores from ITS test scoring results and upload the grades to Blackboard.

The script needs two input files in CSV format: the student info file downloaded from Blackboard (Full Grade Center -> Work Offline -> Download) and the ITS test scoring results (convert the Excel file to CSV). It takes less than one second to produce the results.

Feel free to let me know if you have any questions.


BBfile <- file.choose()   # "roster.csv"   ### the file downloaded from Blackboard
ITSfile <- file.choose()  # "result.csv"   ### the file received from ITS scantron results

# BBfile <- "roster.csv"
# ITSfile <- "result.csv"
output <- "score.csv"
scale.factor <- 1  ### scale factor multiplied by the scantron results

### Extract students' full names from the Blackboard roster
roster <- read.csv(BBfile, header = TRUE, stringsAsFactors = FALSE)
roster$firstname <- as.character(lapply(strsplit(as.character(roster$First.Name), split = " "), "[", 1))
roster$fullname <- tolower(paste(roster$Last.Name, roster$firstname, sep = ""))

### Read the ITS results
df <- read.csv(ITSfile, stringsAsFactors = FALSE)
df <- df[nchar(gsub(" ", "", df$X)) > 0, ]                      # drop blank rows
df <- df[!$X.5)), c("X", "X.2")]   # keep rows with a numeric score
colnames(df) <- c("Name", "Score")
df$Score <- as.numeric(df$Score) * scale.factor

### Extract student names from the ITS results
lastname <- as.character(lapply(strsplit(as.character(df$Name), split = " "), "[", 1))
firstname <- as.character(lapply(strsplit(as.character(df$Name), split = " "), "[", 2))
df$fullname <- tolower(paste(lastname, firstname, sep = ""))

### Match student names between Blackboard and ITS
m.x <- merge(roster, df, by = "fullname", all.x = TRUE)
m.x$raw <- m.x$Score / scale.factor

### Save the results to a CSV file
write.csv(m.x, output, na = "", row.names = FALSE)

### Check for ITS records that did not match any Blackboard roster entry
m.y <- merge(roster, df, by = "fullname", all.x = TRUE, all.y = TRUE)
m.y.sub <- m.y[$Last.Name), ]
score <- read.csv(output, header = TRUE, stringsAsFactors = FALSE)


Analyzing 1.1 Billion NYC Taxi and Uber Trips

I just came across an interesting article: “Analyzing 1.1 Billion NYC Taxi and Uber Trips, with a Vengeance,” an open-source exploration of the city’s neighborhoods, nightlife, airport traffic, and more, through the lens of publicly available taxi and Uber data.

Quoted from the author, Todd W. Schneider:

“The New York City Taxi & Limousine Commission has released a staggeringly detailed historical dataset covering over 1.1 billion individual taxi trips in the city from January 2009 through June 2015. Taken as a whole, the detailed trip-level data is more than just a vast list of taxi pickup and drop off coordinates: it’s a story of New York. How bad is the rush hour traffic from Midtown to JFK? Where does the Bridge and Tunnel crowd hang out on Saturday nights? What time do investment bankers get to work? How has Uber changed the landscape for taxis? And could Bruce Willis and Samuel L. Jackson have made it from 72nd and Broadway to Wall Street in less than 30 minutes? The dataset addresses all of these questions and many more.

I mapped the coordinates of every trip to local census tracts and neighborhoods, then set about in an attempt to extract stories and meaning from the data. This post covers a lot, but for those who want to pursue more analysis on their own: everything in this post—the data, software, and code—is freely available. Full instructions to download and analyze the data for yourself are available on GitHub.”




New article on surface depression published in IJGIS

My new peer-reviewed article titled “A localized contour tree method for deriving geometric and topological properties of complex surface depressions based on high-resolution topographical data” has been published in the latest issue (Issue 12) of the International Journal of Geographical Information Science. You can download a free online copy using this link. In this paper, we developed a localized contour tree method that fully exploits high-resolution topographical data for detecting, delineating, and characterizing surface depressions across scales with a multitude of geometric and topological properties. See some figures below:
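To give a flavor of the topological side, here is a minimal, hypothetical sketch (all names and structure are illustrative, not the paper's implementation) of how closed contours, once extracted, can be nested into a tree and queried for properties such as nesting depth (how many contour levels a compound depression spans) and the number of simple depressions it contains:

```python
class ContourNode:
    """One closed contour; children are contours nested inside it."""
    def __init__(self, elevation):
        self.elevation = elevation
        self.children = []

def tree_height(node):
    """Nesting depth: 1 for a simple depression, more for compound ones."""
    if not node.children:
        return 1
    return 1 + max(tree_height(c) for c in node.children)

def count_leaves(node):
    """Number of simple (innermost) depressions inside this contour."""
    if not node.children:
        return 1
    return sum(count_leaves(c) for c in node.children)

# A compound depression: one outer contour at 100 m enclosing two inner sinks
outer = ContourNode(100.0)
sink1, sink2 = ContourNode(98.0), ContourNode(97.5)
outer.children = [sink1, sink2]
```

Geometric properties (area, perimeter, volume) would be computed from each contour's polygon; the tree supplies the topology that distinguishes a simple sinkhole from a compound one.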





Whitebox GAT’s new website and other developments

Check out Prof. John Lindsay’s new Whitebox GAT website and the GoSpatial project. Lots of cool and innovative stuff!

“GoSpatial is a command-line interface program for analyzing and manipulating geospatial data. It has been developed using the Go programming language and is compiled to native code. The project is experimental and is intended to provide additional analytical support for the Whitebox Geospatial Analysis Tools open-source GIS software. GoSpatial can however be run completely independent of any other software and is run from a single self-contained executable file.”

Whitebox Geospatial Analysis Tools

There are a few exciting announcements related to new developments on the Whitebox GAT front. The first is that due to changes in the Google Code practices (it has become ‘read only’ and is no longer accepting new code commits), the Whitebox project has moved its source code repository to GitHub. I think that this will eventually make for improved source code management, although there may be some initial transition issues that we’ll need to work past. Some of the documentation will have to be updated to reflect this change.

The second announcement, which I am most excited about, is that I have finally found the time to update the Whitebox GAT website. There is a fresh new and more professional look to the site. I hope you enjoy the new webpage and as always, feedback is welcome. What would you like to see added or changed?



R-bridge for ArcGIS



Today at the Esri User Conference in San Diego, Esri announced a new initiative to build a collaborative community for R and ArcGIS users.

Esri has been teaching and promoting integration with R at the User Conference and Developer Summit for several years. During this time we have seen a significant increase in interest and received useful feedback from our ArcGIS users and R users about a variety of needs and techniques for integrating ArcGIS and R. Based on this feedback, we are working with ArcGIS and R users to develop a community to promote learning, sharing, and collaboration. This community will include a repository of free, open-source R scripts, geoprocessing tools, and tutorials.

I recently sat down with Steve Kopp, Senior Product Engineer on the spatial analysis team, and Dawn Wright, Esri’s Chief Scientist to talk about what this focus on building a bridge to the R community means for ArcGIS users and other users of R.

Matt Artz:
What is R?

Steve Kopp: R (aka the R Project for Statistical Computing) is an extremely popular and fast-growing environment for statistical computing. In addition to the core R software, there are more than 6,000 community-contributed packages for solving a wide range of statistical problems, including a variety of spatial statistical data analysis methods.

Dawn Wright:  R is widely used by environmental scientists of all stripes, as well as statisticians. Since R has limited data management and mapping capabilities, many of our users find good synergy in using R and ArcGIS together.

Matt Artz:
Does the ArcGIS community use R today?

Steve Kopp: Yes, R has become very popular in the ArcGIS community over the last several years.  Many in our user community have been asking for a mix of its functionality with our own, as well as better code-sharing interaction with the R community.

Dawn Wright: A great example from the marine ecology community is Duke University’s Marine Geospatial Ecology Tools, which has long integrated R and ArcGIS for Desktop.

Matt Artz:
What is the R – ArcGIS Bridge?

Steve Kopp: This is a free, open source R package which allows ArcGIS and R to dynamically access data without creating intermediate files on disk.

Matt Artz:
Why did Esri build the R – ArcGIS Bridge? 

Steve Kopp: It was built for three reasons: to improve the performance and scalability of projects which combine R and ArcGIS; to create a developer experience that was simple and familiar to the R user; and to enable an end-user experience that is familiar to the ArcGIS user.

Dawn Wright: The bottom line is that this project is about helping our user community and the R user community to be more successful combining these technologies.

Matt Artz:
So is this initiative just some software code?

Steve Kopp: No, the R – ArcGIS Bridge is just the enabling technology; the real effort and value of the R – ArcGIS Community initiative will be in the development and sharing of useful tools, tutorials, and tips. It’s a free, open-source library that makes it fast and easy to move data between ArcGIS and R, plus additional work that makes it possible to run an R script from an ArcGIS geoprocessing tool.

Dawn Wright: This community will be important and useful for R users who need to access ArcGIS data, for ArcGIS users who need to access R analysis capabilities from ArcGIS, and for developers who are familiar with both ArcGIS and R who want to build integrated tools or applications to share with the community.

Steve Kopp: The community of tools will be user developed and user driven. Esri will develop a few sample toolboxes and tutorials, but our primary interest is to facilitate the community and help them build what they find useful.

Matt Artz:
How do you see the ArcGIS community using the R – ArcGIS Bridge?  What does it give them that they don’t have today? 

Steve Kopp: The R – ArcGIS Bridge allows developers experienced with R and ArcGIS to create custom tools and toolboxes that integrate ArcGIS and R, both for their own use and to share with others in their organization or with the wider ArcGIS community.

Dawn Wright: R developers can quickly access ArcGIS datasets from within R, save R results back to ArcGIS datasets and tables, and easily convert between ArcGIS datasets and their equivalent representations in R.

Steve Kopp: It allows our users to integrate R into their workflows, without necessarily learning the R programming language directly.

Matt Artz:
What about the R user who doesn’t use ArcGIS?

Steve Kopp: It’s not uncommon in an organization for a non-GIS person to need to work with GIS data; these people will be able to use the bridge to directly access ArcGIS data without creating intermediate shapefiles or tables, and without needing to know any ArcGIS.

Matt Artz:
How can people start using the R Bridge? 

Steve Kopp:  The R – ArcGIS community samples, tutorials, and bridge are all part of a public GitHub community site similar to other Esri open source projects. And if you happen to be at the Esri User Conference in San Diego this week, this project will be discussed as part of a workshop on Wednesday.



New IJGIS Paper Published / New Position Beginning August 2015

I am pleased to announce that our paper titled “A localized contour tree method for deriving geometric and topological properties of complex surface depressions based on high-resolution topographical data” has been published online in the International Journal of Geographical Information Science.

In addition, I am also happy to announce that I have successfully defended my Ph.D. dissertation and I will be joining the Department of Geography at State University of New York at Binghamton (SUNY-Binghamton) as a tenure-track assistant professor beginning August 2015.


Making the most detailed tweet map ever | Mapbox

An interesting article on geotagged tweets with open-source tools!

Making the most detailed tweet map ever | Mapbox.

New peer-reviewed article on wetland classification

My co-authored article on wetland classification just got published online:

Improved Wetland Classification Using Eight-Band High Resolution Satellite Imagery and a Hybrid Approach

Charles R. Lane 1,*, Hongxing Liu 2,3, Bradley C. Autrey 1, Oleg A. Anenkhonov 4, Victor V. Chepinoga 5,6 and Qiusheng Wu 2,3

Abstract: Although remote sensing technology has long been used in wetland inventory and monitoring, the accuracy and detail level of wetland maps derived with moderate resolution imagery and traditional techniques have been limited and often unsatisfactory. We explored and evaluated the utility of a newly launched high-resolution, eight-band satellite system (Worldview-2; WV2) for identifying and classifying freshwater deltaic wetland vegetation and aquatic habitats in the Selenga River Delta of Lake Baikal, Russia, using a hybrid approach and a novel application of Indicator Species Analysis (ISA). We achieved an overall classification accuracy of 86.5% (Kappa coefficient: 0.85) for 22 classes of aquatic and wetland habitats and found that additional metrics, such as the Normalized Difference Vegetation Index and image texture, were valuable for improving the overall classification accuracy and particularly for discriminating among certain habitat classes. Our analysis demonstrated that including WV2’s four spectral bands from parts of the spectrum less commonly used in remote sensing analyses, along with the more traditional bandwidths, contributed to the increase in the overall classification accuracy by ~4% overall, but with considerable increases in our ability to discriminate certain communities. The coastal band improved differentiating open water and aquatic (i.e., vegetated) habitats, and the yellow, red-edge, and near-infrared 2 bands improved discrimination among different vegetated aquatic and terrestrial habitats. The use of ISA provided statistical rigor in developing associations between spectral classes and field-based data. Our analyses demonstrated the utility of a hybrid approach and the benefit of additional bands and metrics in providing the first spatially explicit mapping of a large and heterogeneous wetland system.

New peer-reviewed article on vernal pool detection

An Effective Method for Detecting Potential Woodland Vernal Pools Using High-Resolution LiDAR Data and Aerial Imagery

Qiusheng Wu 1,3,*, Charles Lane 2 and Hongxing Liu 3

Abstract: Effective conservation of woodland vernal pools—important components of regional amphibian diversity and ecosystem services—depends on locating and mapping these pools accurately. Current methods for identifying potential vernal pools are primarily based on visual interpretation and digitization of aerial photographs, with variable accuracy and low repeatability. In this paper, we present an effective and efficient method for detecting and mapping potential vernal pools using stochastic depression analysis with additional geospatial analysis. Our method was designed to take advantage of high-resolution light detection and ranging (LiDAR) data, which are becoming increasingly available, though not yet frequently employed in vernal pool studies. We successfully detected more than 2000 potential vernal pools in a ~150 km2 study area in eastern Massachusetts. The accuracy assessment in our study indicated that the commission rates ranged from 2.5% to 6.0%, while the proxy omission rate was 8.2%, rates that are much lower than reported errors of previous vernal pool studies conducted in the northeastern United States. One significant advantage of our semi-automated approach for vernal pool identification is that it may reduce inconsistencies and alleviate repeatability concerns associated with manual photointerpretation methods. Another strength of our strategy is that, in addition to detecting the point-based vernal pool locations for the inventory, the boundaries of vernal pools can be extracted as polygon features to characterize their geometric properties, which are not available in the current statewide vernal pool databases in Massachusetts.
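The abstract does not spell out the stochastic depression analysis, but the general Monte Carlo idea can be sketched as follows (a simplified illustration, not the authors' implementation; the priority-flood sink filling and all function and parameter names are assumptions): perturb the DEM with random error many times, detect depressions in each realization, and keep only cells that fall inside a depression in most realizations.

```python
import heapq
import random

def fill_depressions(dem):
    """Priority-flood sink filling: raise every cell to the lowest spill
    elevation reachable from the grid edge."""
    nr, nc = len(dem), len(dem[0])
    filled = [[None] * nc for _ in range(nr)]
    pq = []
    for r in range(nr):
        for c in range(nc):
            if r in (0, nr - 1) or c in (0, nc - 1):  # seed with edge cells
                filled[r][c] = dem[r][c]
                heapq.heappush(pq, (dem[r][c], r, c))
    while pq:
        z, r, c = heapq.heappop(pq)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < nr and 0 <= cc < nc and filled[rr][cc] is None:
                filled[rr][cc] = max(dem[rr][cc], z)
                heapq.heappush(pq, (filled[rr][cc], rr, cc))
    return filled

def stochastic_depressions(dem, error_sd=0.05, n=20, threshold=0.8, seed=1):
    """Monte Carlo sketch: add random DEM error, fill sinks, and flag cells
    that sit in a depression in at least `threshold` of the realizations."""
    rng = random.Random(seed)
    nr, nc = len(dem), len(dem[0])
    counts = [[0] * nc for _ in range(nr)]
    for _ in range(n):
        noisy = [[dem[r][c] + rng.gauss(0, error_sd) for c in range(nc)]
                 for r in range(nr)]
        filled = fill_depressions(noisy)
        for r in range(nr):
            for c in range(nc):
                if filled[r][c] - noisy[r][c] > 1e-9:  # cell was filled
                    counts[r][c] += 1
    return [[counts[r][c] / n >= threshold for c in range(nc)]
            for r in range(nr)]

# Demo: a flat 9 x 9 surface at 10 m with one 2 m deep pit
dem = [[10.0] * 9 for _ in range(9)]
dem[4][4] = 8.0
mask = stochastic_depressions(dem)
```

The thresholding step is what separates persistent, real depressions from artifacts that appear only under particular error realizations; the paper's additional geospatial analysis (e.g., filtering by imagery) is not shown here.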