Accuracy Assessment in Image Classification on GEE

This post covers accuracy assessment of a supervised classification of a Sentinel-2 image in Google Earth Engine.

  1. Selection of the area of interest and the classification classes

This is done using the geometry tools, as shown below. One can use the point tool or the polygon tool to collect samples for each class.
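Samples drawn with the geometry tools are stored as geometry imports at the top of the script. Defined directly in code they would look roughly like the sketch below; the coordinates and the class label value are illustrative placeholders, not the actual study area.

```javascript
// Hypothetical AOI polygon and one class of training points.
// All coordinates are placeholders; in practice these come from the geometry tools.
var aoi = ee.Geometry.Polygon([
  [[36.70, -1.20], [36.95, -1.20], [36.95, -1.45], [36.70, -1.45]]
]);
var forested_land = ee.FeatureCollection([
  ee.Feature(ee.Geometry.Point([36.80, -1.30]), {landcover: 0}),
  ee.Feature(ee.Geometry.Point([36.82, -1.31]), {landcover: 0})
]);
```

Each class collection carries a numeric `landcover` property, which later serves as the class label for training.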

  2. Selection of the image collection

The image is selected from the Sentinel-2 image collection and filtered for clouds to keep only scenes with low cloud cover. The collection is also filtered by acquisition date and by the bounds of the area of interest. The output is shown below.

var collection = ee.ImageCollection('COPERNICUS/S2') // searches all Sentinel-2 imagery pixels...
    .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 10)) // ...filters on the metadata for scenes with less than 10% cloud cover
    .filterDate('2017-03-30', '2017-07-30') // ...chooses only scenes between the dates you define here
    .filterBounds(aoi); // ...that are within your aoi
print(collection); // this generates a JSON list of the images (and their metadata) which the filters found in the right-hand window.
// So far this finds all the images in the collection which meet the criteria - the latest on top.
// To get a nice blended-looking mosaic, try some of the tools for 'reducing' these to one pixel (or bands of pixels in a layer stack).
var medianpixels = collection.median(); // This finds the median value of all the pixels which meet the criteria.
var medianpixelsclipped = medianpixels.clip(aoi).divide(10000); // this clips the result so that it fits neatly into your aoi
                                                                // and divides so that values fall between 0 and 1
  3. Visualization

Bands 8, 4, and 3 are used to visualize the image as a color-infrared (false-color) composite, in which vegetation appears red. (A natural-color display would use bands 4, 3, and 2.)

// Now visualize the mosaic as a color-infrared (false-color) composite.
Map.addLayer(medianpixelsclipped, {bands: ['B8', 'B4', 'B3'], min: 0, max: 1, gamma: 1.4}, 'Sentinel_2 mosaic');
  4. Centering

This centers the map on the area of interest.

Map.centerObject(aoi);
//Map.addLayer(image, truecolor, 'Composite');
  5. Merging of the classes

The land cover classes are merged into a single feature collection, and the image is sampled at those locations to build the training points.

var newfc = forested_land.merge(urbanized_land).merge(bare_land).merge(maize).merge(millet).merge(grassland).merge(shrubland);

var bands = ['B2', 'B3', 'B4', 'B5', 'B6', 'B8', 'B9'];

// sample the image at the class geometries, keeping the 'landcover' label on each point
var points = medianpixelsclipped.select(bands).sampleRegions({
  collection: newfc,
  properties: ['landcover'],
  scale: 10
});
  6. Division of the data into training data and validation data

The data is split into two sets: seventy percent of the points are used for training, and thirty percent are held out for validation.

var points = points.randomColumn(); // adds a 'random' column of uniform values in [0, 1)
var training = points.filter(ee.Filter.lt('random', 0.7));
var validation = points.filter(ee.Filter.gte('random', 0.7));


  7. Training the classifier and classifying the image

A CART classifier is trained on the training data, using the land cover property as the label and the selected bands as predictors, and then applied to the image.

var classifier = ee.Classifier.smileCart().train({
  features: training,
  classProperty: 'landcover',
  inputProperties: bands
});

var classified = medianpixelsclipped.select(bands).classify(classifier);

Map.addLayer(classified, {min: 0, max: 6, palette: ['145A32', 'F43E97', 'D7DBDD', 'F4D03F', '784212', '15CDEA', 'C32A15']}, 'Classified');
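With the held-out validation set, the accuracy assessment the title promises can be carried out by classifying the validation points and building an error matrix. A minimal sketch ('classification' is the default name of the property that classify() adds to each feature):

```javascript
// Classify the validation points and compare the predicted class
// against the reference 'landcover' label.
var validated = validation.classify(classifier);
var testAccuracy = validated.errorMatrix('landcover', 'classification');
print('Validation error matrix:', testAccuracy);
print('Overall accuracy:', testAccuracy.accuracy());
print('Kappa coefficient:', testAccuracy.kappa());
```

The diagonal of the matrix counts correctly classified validation points, while accuracy() and kappa() summarize it into overall agreement measures.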

Patience Kori

A GIS enthusiast keen to use location analytics, web mapping, web design, and web development to solve day-to-day challenges in the world. An avid reader, especially of growing trends in machine learning, the Internet of Things, and artificial intelligence.
