End-to-End Google Earth Engine (Full Course)

A hands-on introduction to applied remote sensing using Google Earth Engine.

Ujaval Gandhi

- Introduction
- Sign-up for Google Earth Engine
- Complete the Class Pre-Work
- Get the Course Videos
  - YouTube
  - Vimeo
- Get the Course Materials
- Module 1: Earth Engine Basics
  - 01. Hello World
    - Exercise
    - Saving Your Work
  - 02. Working with Image Collections
    - Exercise
  - 03. Filtering Image Collections
    - Exercise
  - 04. Creating Mosaics and Composites from ImageCollections
    - Exercise
  - 05. Working with Feature Collections
    - Exercise
  - 06. Importing Data
    - Exercise
  - 07. Clipping Images
    - Exercise
  - 08. Exporting Data
    - Exercise
  - Assignment 1
- Module 2: Earth Engine Intermediate
  - 01. Earth Engine Objects
    - Exercise
  - 02. Calculating Indices
    - Exercise
  - 03. Computation on ImageCollections
    - Exercise
  - 04. Cloud Masking
    - Exercise
  - 05. Reducers
    - Exercise
  - 06. Time-Series Charts
    - Exercise
  - Assignment 2
- Module 3: Supervised Classification
  - Introduction to Machine Learning and Supervised Classification
  - 01. Basic Supervised Classification
    - Exercise
  - 02. Accuracy Assessment
    - Exercise
  - 03. Improving the Classification
    - Exercise
  - 04. Exporting Classification Results
    - Exercise
  - 05. Calculating Area
    - Exercise
  - Assignment 3
- Module 4: Change Detection
  - Introduction to Change Detection
  - 01. Spectral Index Change
    - Exercise
  - 02. Spectral Distance Change
    - Exercise
  - 03. Direct Classification of Change
    - Exercise
  - 04. Post-classification Comparison
    - Exercise
- Module 5: Earth Engine Apps
  - 01. Client vs. Server
    - Exercise
  - 02. Using UI Elements
    - Exercise
  - 03. Building and Publishing an App
    - Exercise
  - 04. Publishing the App
    - Exercise
  - 05. Create a Split Panel App
    - Exercise
- Module 6: Google Earth Engine Python API
  - Introduction to the Python API
    - Google Colab
    - Local Development Environment
  - 01. Python API Syntax
    - Exercise
  - 02. Automatic Conversion of Javascript Code to Python
    - Exercise
  - 03. Batch Exports
    - Exercise
  - 04. Using Earth Engine with XArray
    - Exercise
  - 05. Automating Downloads
  - 06. Automating Exports
  - 07. Using the Google Earth Engine QGIS Plugin
- Supplement
- Guided Projects
  - Get the Code
  - Project 1: Calculating Rainfall Deviation
  - Project 2: Flood Mapping
  - Project 3: Extracting Time-Series
  - Project 4: Land Cover Analysis
  - Project 5: Extracting Nighttime Lights Statistics
- Learning Resources
- Useful Public Repositories
- Debugging Errors and Scaling Your Analysis
- Data Credits
- References
  - Composites
  - Supervised Classification
  - Time-Series Classification
- License
- Citing and Referencing

Introduction

Google Earth Engine is a cloud-based platform that enables large-scale processing of satellite imagery to detect changes, map trends, and quantify differences on the Earth's surface. This course covers the full range of topics in Earth Engine to give the participants practical skills to master the platform and implement their remote sensing projects.
Watch the Video → (https://www.youtube.com/watch?v=lGuCnutl2qw&list=PLppGmFLhQ1HJulb7qMikWv11HiEQhySha&index=1)

Access the Presentation → (https://docs.google.com/presentation/d/1q8HRDTagQEp3Hmi8IGOT7dJPLTC1wRigajXewFTmoVE/edit?usp=sharing)

Sign-up for Google Earth Engine

If you already have a Google Earth Engine account, you can skip this step.

Visit our GEE Sign-Up Guide (gee-sign-up.html) for step-by-step instructions.

Complete the Class Pre-Work

This class needs about 2 hours of pre-work. Please watch the following videos to get a good understanding of remote sensing and how Earth Engine works.

- Introduction to Remote Sensing (https://www.youtube.com/watch?v=AyNu9HbK&s): This video introduces the remote sensing concepts, terminology and techniques.
- Introduction to Google Earth Engine (https://www.youtube.com/watch?v=kpincBHZBto): This video gives a broad overview of Google Earth Engine with selected case studies and applications. The video also covers the Earth Engine architecture and how it is different than traditional remote sensing software.

Get the Course Videos

The course is accompanied by a set of videos covering all the modules. These videos are recorded from our live instructor-led classes and are edited to make them easier to consume for self-study. We have 2 versions of the videos:

YouTube

We have created a YouTube Playlist with separate videos for each module to enable effective online-learning.

Access the YouTube Playlist → (https://www.youtube.com/playlist?list=PLppGmFLhQ1HJulb7qMikWv11HiEQhySha)

Vimeo

We are also making the module videos available on Vimeo. These videos can be downloaded for offline learning.

Access the Vimeo Playlist → (https://vimeo.com/showcase/11467828?share=copy)

Get the Course Materials

The course material and exercises are in the form of Earth Engine scripts shared via a code repository.

1. Click this link (https://code.earthengine.google.co.in/?accept_repo=users/ujavalgandhi/End-to-End-GEE) to open the Google Earth Engine code editor and add the repository to your account.
2. If successful, you will have a new repository named users/ujavalgandhi/End-to-End-GEE in the Scripts tab in the Reader section.
3. Verify that your code editor looks like below.
If you do not see the repository in the Reader section, click the Refresh repository cache button in your Scripts tab and it will show up.

[Screenshot: Code Editor after adding the class repository]

There are several slide decks containing useful information and references. You can access all the presentations used in the course from the links below.

- Introduction and Course Overview [Presentation → (https://docs.google.com/presentation/d/1q8HRDTagQEp3Hmi8IGOT7dJPLTC1wRigajXewFTmoVE/edit?usp=sharing)]
- Map/Reduce Programming Concepts [Presentation → (https://docs.google.com/presentation/d/1OqOyxhubkwnsAVjnIWS4ETgwUHq3DXYKoSHGb6Gemi0/edit?usp=sharing)]
- Introduction to Machine Learning & Supervised Classification [Presentation → (https://docs.google.com/presentation/d/19L1bSvsxb38xSBGIHNKOjvPZ01GqDhv9368tbtMELSw/edit?usp=sharing)]
- Introduction to Change Detection [Presentation → (https://docs.google.com/presentation/d/1vdFTW361yDuVFbthprumQ8zuMPGwGcHpHsBTRgojoSl/edit?usp=sharing)]
- Introduction to Earth Engine Apps [Presentation → (https://docs.google.com/presentation/d/1u4Q910qT9_OS4m1OWMm3uRUgu_oseqDUxHV-8mpzGz4/edit?usp=sharing)]
- Introduction to Google Earth Engine Python API [Presentation → (https://docs.google.com/presentation/d/1hPVRnxp2Vp1VHXBLu36SH_UtEOjPz70KeDV-zGlin3U/edit?usp=sharing)]

Module 1: Earth Engine Basics

Module 1 is designed to give you basic skills to be able to find datasets you need for your project, filter them to your region of interest, apply basic processing and export the results. Mastering this will allow you to start using Earth Engine for your project quickly and save a lot of time pre-processing the data.

Watch the Video → (https://www.youtube.com/watch?v=daYhxVVIpJU&list=PLppGmFLhQ1HJulb7qMikWv11HiEQhySha&index=2)

01. Hello World

This script introduces the basic JavaScript syntax, and the video covers the programming concepts you need to learn when using Earth Engine. To learn more, visit the Introduction to JavaScript for Earth Engine (https://developers.google.com/earth-engine/tutorials/tutorial_js_01) section of the Earth Engine User Guide.

The Code Editor is an Integrated Development Environment (IDE) for the Earth Engine JavaScript API. It offers an easy way to type, debug, run and manage code. Type the code below and click Run to execute it and see the output in the Console tab.

Tip: You can use the keyboard shortcut Ctrl+Enter to run the code in the Code Editor.

[Screenshot: Hello World script in the Code Editor]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F01b_Hello_World_(complete))
print('Hello World');

// Variables
var city = 'Bengaluru';
var country = 'India';
print(city, country);

var population = 8400000;
print(population);

// List
var majorCities = ['Mumbai', 'Delhi', 'Chennai', 'Kolkata'];
print(majorCities);

// Dictionary
var cityData = {
  'city': city,
  'population': 8400000,
  'elevation': 930
};
print(cityData);

// Function
var greet = function(name) {
  return 'Hello ' + name;
};
print(greet('World'));

// This is a comment

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F01c_Hello_World_(exercise))

// These are the 5 largest cities in the world:
// Tokyo, Delhi, Shanghai, Mexico City, Sao Paulo

// Create a list named 'largeCities'
// The list should have names of all the above cities
// Print the list

Saving Your Work

When you modify any script from the course repository, you may want to save a copy for yourself. If you try to click the Save button, you will get an error message like below.

[Screenshot: Save a copy dialog]

This is because the shared class repository is a Read-only repository. You can click Yes to save a copy in your own repository.

If this is the first time you are using Earth Engine, you will be prompted to choose an Earth Engine username. Choose the name carefully, as it cannot be changed once created.

[Screenshot: 'Choose an Earth Engine username' dialog]

After entering your username, your home folder will be created. After that, you will be prompted to enter a new repository. A repository can help you organize and share code. Your account can have multiple repositories, and each repository can have multiple scripts inside it. To get started, you can create a repository named 'default'.

Finally, you will be able to save the script.

02. Working with Image Collections

Most datasets in Earth Engine come as an ImageCollection. An ImageCollection is a dataset that consists of images taken at different times and locations, usually from the same satellite or data provider. You can load a collection by searching the Earth Engine Data Catalog (https://developers.google.com/earth-engine/datasets) for the ImageCollection ID. Search for the Sentinel-2 Level-1C dataset and you will find its id COPERNICUS/S2_HARMONIZED. Visit the Sentinel-2, Level-1C page (https://developers.google.com/earth-engine/datasets/catalog/COPERNICUS_S2) and see the Explore in Earth Engine section to find the code snippet to load and visualize the collection. This snippet is a great starting point for your work with this dataset. Click the Copy Code Sample button and paste the code into the code editor. Click Run and you will see the image tiles load in the map.
[Screenshot: Sentinel-2 dataset page in the Earth Engine Data Catalog showing the 'Explore in Earth Engine' code snippet]

In the code snippet, you will see a function Map.setCenter() which sets the viewport to a specific location and zoom level. The function takes the X coordinate (longitude), Y coordinate (latitude) and Zoom Level parameters. Replace the X and Y coordinates with the coordinates of your city and click Run to see the images of your city.
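The snippet below is a sketch of the kind of sample the catalog's Explore in Earth Engine section provides; the filter dates, cloud threshold, visualization parameters and coordinates here are illustrative assumptions, so the exact code on the dataset page may differ.

// Sketch of a catalog-style snippet (illustrative, not the exact sample
// from the dataset page). It loads the Sentinel-2 collection, applies a
// simple cloud-cover filter, and centers the map on a chosen location.
var dataset = ee.ImageCollection('COPERNICUS/S2_HARMONIZED')
  .filter(ee.Filter.date('2022-01-01', '2022-02-01'))
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20));

var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};

// Map.setCenter(longitude, latitude, zoomLevel)
Map.setCenter(77.59, 12.97, 12);
Map.addLayer(dataset.median(), rgbVis, 'RGB');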
Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F02c_Image_Collections_(exercise))

// Find the 'Sentinel-2 Level-1C' dataset page
// https://developers.google.com/earth-engine/datasets
// Copy/paste the code snippet
// Change the code to display images for your home city

03. Filtering Image Collections

The collection contains all imagery ever collected by the sensor. The entire collection is not very useful; most applications require a subset of the images. We use filters to select the appropriate images. There are many types of filter functions; look at the ee.Filter module to see all available filters. Select a filter and then run the filter() function with the filter parameters.

We will learn about 3 main types of filtering techniques:

- Filter by metadata: You can apply a filter on the image metadata using filters such as ee.Filter.eq(), ee.Filter.lt() etc. You can filter by PATH/ROW values, Orbit number, Cloud cover etc.
- Filter by date: You can select images in a particular date range using filters such as ee.Filter.date().
- Filter by location: You can select the subset of images within a bounding box, location or geometry using ee.Filter.bounds(). You can also use the drawing tools to draw a geometry for filtering.

After applying the filters, you can use the size() function to check how many images match the filters.

[Screenshot: Console output showing the size of the filtered collection]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F03b_Filtering_Image_Collection_(complete))

var geometry = ee.Geometry.Point([77.60412933051538, 12.952912912328241]);
Map.centerObject(geometry, 10);
var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');

// Filter by metadata
var filtered = s2.filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30));

// Filter by date
var filtered = s2.filter(ee.Filter.date('2019-01-01', '2020-01-01'));

// Filter by location
var filtered = s2.filter(ee.Filter.bounds(geometry));

// Let's apply all the 3 filters together on the collection

// First apply metadata filter
var filtered1 = s2.filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30));
// Then apply date filter on the results
var filtered2 = filtered1.filter(
  ee.Filter.date('2019-01-01', '2020-01-01'));
// Lastly apply the location filter
var filtered = filtered2.filter(ee.Filter.bounds(geometry));

// Instead of applying filters one after the other, we can 'chain' them
// Use the . notation to apply all the filters together
var filtered = s2.filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));

print(filtered.size());

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F03c_Filtering_Image_Collection_(exercise))

var geometry = ee.Geometry.Point([77.60412933051538, 12.952912912328241]);
var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));
print(filtered.size());

// Exercise
// Delete the 'geometry' variable
// Add a point at your chosen location
// Change the filter to find images from January 2023

// Note: If you are in a very cloudy region,
// make sure to adjust the CLOUDY_PIXEL_PERCENTAGE

04. Creating Mosaics and Composites from ImageCollections

The default order of the collection is by date. So when you display the collection, it implicitly creates a mosaic with the latest pixels on top. You can call .mosaic() on an ImageCollection to create a mosaic image from the pixels at the top.

We can also create a composite image by applying a selection criteria to each pixel from all pixels in the stack. Here we use the median() function to create a composite where each pixel value is the median of all pixels in the stack.

Tip: If you need to create a mosaic where the images are in a specific order, you can use the .sort() function to sort your collection by a property first (see the sketch after the code below).

[Figure: Mosaic vs. Composite]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F04b_Mosaics_and_Composites_(complete))

var geometry = ee.Geometry.Point([77.60412933051538, 12.952912912328241]);
var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));

var mosaic = filtered.mosaic();
var medianComposite = filtered.median();

Map.centerObject(geometry, 10);
var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};

Map.addLayer(filtered, rgbVis, 'Filtered Collection');
Map.addLayer(mosaic, rgbVis, 'Mosaic');
Map.addLayer(medianComposite, rgbVis, 'Median Composite');
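As mentioned in the tip above, the ordering of a mosaic can be controlled by sorting the collection first. The lines below are a small sketch of that idea (they are not part of the original script): sorting by cloud cover in descending order puts the least cloudy pixels on top of the stack before mosaic() is applied.

// Sketch: sort the collection so the least cloudy images end up on top,
// then build the mosaic from the sorted collection.
var sortedCollection = filtered.sort('CLOUDY_PIXEL_PERCENTAGE', false);
var leastCloudyMosaic = sortedCollection.mosaic();
Map.addLayer(leastCloudyMosaic, rgbVis, 'Least Cloudy Mosaic');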
Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F04c_Mosaics_and_Composites_(exercise))

// Create a median composite for the year 2020 and load it to the map
// Compare both the composites to see the changes in your city

05. Working with Feature Collections

Feature Collections are similar to Image Collections, but they contain Features, not images. They are equivalent to Vector Layers in a GIS. We can load, filter and display Feature Collections using similar techniques that we have learned so far.

Search for 'GAUL Second Level Administrative Boundaries' and load the collection. This is a global collection that contains all Admin2 boundaries. We can apply a filter using the ADM1_NAME property to get all Admin2 boundaries (i.e. Districts) from a state.

[Screenshot: Selected Admin2 boundaries displayed on the map]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F05b_Feature_Collections_(complete))

// Use the GAUL 2015 dataset from the GEE Data Catalog
var admin2 = ee.FeatureCollection('FAO/GAUL_SIMPLIFIED_500m/2015/level2');
var filtered = admin2.filter(ee.Filter.eq('ADM1_NAME', 'Karnataka'));

// Alternatively, use the newer GAUL 2024 dataset from the community catalog
// https://gee-community-catalog.org/projects/gaul/
var admin2 = ee.FeatureCollection('projects/sat-io/open-datasets/FAO/GAUL/GAUL_2024_L2');
var filtered = admin2.filter(ee.Filter.eq('gaul1_name', 'Karnataka'));

var visParams = {'color': 'red'};
Map.addLayer(filtered, visParams, 'Selected Districts');

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F05c_Feature_Collections_(exercise))

var admin2 = ee.FeatureCollection('FAO/GAUL_SIMPLIFIED_500m/2015/level2');
Map.addLayer(admin2, {color: 'grey'}, 'All Admin2 Polygons');

// Exercise
// Apply filters to select your chosen Admin2 region
// Display the results in 'red' color

// Hint1: Switch to the 'Inspector' tab and click on any
// polygon to know its properties and their values
// Hint2: Many countries do not have unique names for
// Admin2 regions. Make sure to apply a filter to select
// the Admin1 region that contains your chosen Admin2 region

06. Importing Data

You can import vector or raster data into Earth Engine. We will now import a shapefile of Urban Centres (https://bit.ly/ghs-ucdb-shapefile) from JRC's GHS Urban Centre Database (GHS-UCDB). Unzip the ghs_urban_centers.zip into a folder on your computer. In the Code Editor, go to Assets → New → Table Upload → Shape Files. Select the .shp, .shx, .dbf and .prj files. Enter ghs_urban_centers as the Asset Name and click Upload. Once the upload and ingest finishes, you will have a new asset in the Assets tab. The shapefile is imported as a Feature Collection in Earth Engine. Select the ghs_urban_centers asset and click Import. You can then visualize the imported data.

[Screenshot: Importing a Shapefile via the New Table Upload dialog]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F06b_Import_(complete))

// Let's import some data to Earth Engine

// Upload the 'GHS Urban Centers' database from JRC
// https://ghsl.jrc.ec.europa.eu/ghs_stat_ucdb2015mt_r2019a.php
// Download the shapefile from https://bit.ly/ghs-ucdb-shapefile
// Unzip and upload

// Import the collection
var urban = ee.FeatureCollection('users/ujavalgandhi/e2e/ghs_urban_centers');

// Visualize the collection
Map.addLayer(urban, {color: 'blue'}, 'Urban Areas');

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F06c_Import_(exercise))

var urban = ee.FeatureCollection('users/ujavalgandhi/e2e/ghs_urban_centers');
print(urban.first());

// Exercise
// Apply a filter to select only large urban centers
// in your country and display it on the map.

// Select all urban centers in your country with
// a population greater than 1000000

// Hint1: Use the property 'CTR_MN_NM' containing country names
// Hint2: Use the property 'P15' containing 2015 Population
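For reference, one possible way to approach this exercise is sketched below; the country name is an assumption you should replace with your own, and the property names are taken from the hints above.

// Sketch of a possible solution: chain an equality filter on the country
// name with a numeric filter on the 2015 population.
// Replace 'India' with your own country.
var largeUrban = urban
  .filter(ee.Filter.eq('CTR_MN_NM', 'India'))
  .filter(ee.Filter.gt('P15', 1000000));
Map.addLayer(largeUrban, {color: 'red'}, 'Large Urban Centers');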
07. Clipping Images

It is often desirable to clip the images to your area of interest. You can use the clip() function to mask out an image using a geometry.

While in a desktop software clipping is desirable to remove unnecessary portions of a large image and save computation time, in Earth Engine clipping can actually increase the computation time. As described in the Earth Engine Coding Best Practices (https://developers.google.com/earth-engine/guides/best_practices?hl=en#if-you-dont-need-to-clip-dont-use-clip) guide, avoid clipping the images or do it at the end of your script.

[Figure: Original vs. Clipped image]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F07b_Clipping_(complete))

var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');
var urban = ee.FeatureCollection('users/ujavalgandhi/e2e/ghs_urban_centers');

// Find the name of the urban centre
// by adding the layer to the map and using Inspector.
var filtered = urban
  .filter(ee.Filter.eq('UC_NM_MN', 'Bengaluru'))
  .filter(ee.Filter.eq('CTR_MN_NM', 'India'));
var geometry = filtered.geometry();

var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));

var image = filtered.median();

var clipped = image.clip(geometry);

Map.centerObject(geometry);
Map.addLayer(clipped, rgbVis, 'Clipped');

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F07c_Clipping_(exercise))

// Add the imported table to the Map
// Use the Inspector to find the id of your home city or any urban area of your choice
// Change the filter to use the id of the selected feature

08. Exporting Data

Earth Engine allows for exporting both vector and raster data to be used in an external program. Vector data can be exported as a CSV or a Shapefile, while rasters can be exported as GeoTIFF files. We will now export the Sentinel-2 composite as a GeoTIFF file.

Tip: The Code Editor supports autocompletion of API functions using the combination Ctrl+Space. Type a few characters of a function and press Ctrl+Space to see autocomplete suggestions. You can also use the same key combination to fill all parameters of the function automatically.

Once you run this script, the Tasks tab will be highlighted. Switch to the tab and you will see the tasks waiting. Click Run next to each task to start the process.

[Screenshot: Tasks tab showing the waiting export tasks]

On clicking the Run button, you will be prompted with a confirmation dialog. Verify the settings and click Run to start the export.
Once the Export finishes, a GeoTIFF file for each export task will be added to your Google Drive in the specified folder. You can download them and use them in a GIS software.

[Screenshot: 'Task: Initiate image export' confirmation dialog]

[Figure: Visualized Composite vs. Raw Composite]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A01-Earth-Engine-Basics%2F08b_Export_(complete))

var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');
var urban = ee.FeatureCollection('users/ujavalgandhi/e2e/ghs_urban_centers');

var filtered = urban
  .filter(ee.Filter.eq('UC_NM_MN', 'Bengaluru'))
  .filter(ee.Filter.eq('CTR_MN_NM', 'India'));
var geometry = filtered.geometry();

var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));

var image = filtered.median();

var clipped = image.clip(geometry);

Map.centerObject(geometry);
Map.addLayer(clipped, rgbVis, 'Clipped');

var exportImage = clipped.select('B.*');

// Export raw image with original pixel values
Export.image.toDrive({
  image: exportImage,
  description: 'Bangalore_Composite_Raw',
  folder: 'earthengine',
  fileNamePrefix: 'bangalore_composite_raw',
  region: geometry,
  scale: 10,
  maxPixels: 1e9
});

// Export visualized image as colorized RGB image

// Rather than exporting raw bands, we can apply a rendered image.
// The visualize() function allows you to apply the same parameters
// that are used in Earth Engine, which exports a 3-band RGB image.

// Note: Visualized images are not suitable for analysis
var visualized = clipped.visualize(rgbVis);

Export.image.toDrive({
  image: visualized,
  description: 'Bangalore_Composite_Visualized',
  folder: 'earthengine',
  fileNamePrefix: 'bangalore_composite_visualized',
  region: geometry,
  scale: 10,
  maxPixels: 1e9
});

Exercise

Try in Code Editor → (https://code.earthengine.google.…)

…

Module 2: Earth Engine Intermediate

03. Computation on ImageCollections

// Map the function over the collection
var withNdvi = filteredS2.map(addNDVI);

var composite = withNdvi.median();
var ndviComposite = composite.select('ndvi');

var palette = [
  'FFFFFF', 'CE7E45', 'DF923D', 'F1B555', 'FCD163', '99B718',
  '66A000', '023B01', '012E01', '011D01', '011301'];

var ndviVis = {min: 0, max: 0.5, palette: palette};
Map.addLayer(ndviComposite.clip(geometry), ndviVis, 'ndvi');

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A02-Earth-Engine-Intermediate%2F03c_Computation_on_Image_Collections_(exercise))

var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');
var admin1 = ee.FeatureCollection('FAO/GAUL_SIMPLIFIED_500m/2015/level1');

// Select an Admin1 region
var filteredAdmin1 = admin1.filter(ee.Filter.eq('ADM1_NAME', 'Karnataka'));
var geometry = filteredAdmin1.geometry();
Map.centerObject(geometry);

var rgbVis = {min: 0.0, max: 3000, bands: ['B4', 'B3', 'B2']};

var filteredS2 = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));

var composite = filteredS2.median();
Map.addLayer(composite.clip(geometry), rgbVis, 'Admin1 Composite');

// This function calculates both NDVI and NDWI indices
// and returns an image with 2 new bands added to the original image.
function addIndices(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi');
  var ndwi = image.normalizedDifference(['B3', 'B8']).rename('ndwi');
  return image.addBands(ndvi).addBands(ndwi);
}

// Map the function over the collection
var withIndices = filteredS2.map(addIndices);

// Composite
var composite = withIndices.median();
print(composite);

// Exercise
// Display a map of NDWI for the region
// Select the 'ndwi' band and clip it before displaying
// Use a color palette from https://colorbrewer2.org/
// Hint: Use the .select() function to select a band
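The lines below sketch one way the exercise could be completed; the palette is an illustrative choice (any ColorBrewer ramp would do) and is not prescribed by the course.

// Sketch: select the 'ndwi' band from the composite, clip it to the
// region, and display it with a simple two-color palette.
var ndwiComposite = composite.select('ndwi');
var ndwiVis = {min: -0.5, max: 0.5, palette: ['white', 'blue']};
Map.addLayer(ndwiComposite.clip(geometry), ndwiVis, 'ndwi');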
04. Cloud Masking

Masking pixels in an image makes those pixels transparent and excludes them from analysis and visualization. To mask an image, we can use the updateMask() function and pass it an image with 0 and 1 values. All pixels where the mask image is 0 will be masked.

Most remote sensing datasets come with a QA or Cloud Mask band that contains the information on whether pixels are cloudy or not. Your Code Editor contains pre-defined functions for masking clouds for popular datasets under Scripts Tab → Examples → Cloud Masking.

The script below takes the Sentinel-2 masking function and shows how to apply it on an image.

To understand how cloud-masking functions work and learn advanced techniques for bitmasking, please refer to our article on Working with QA Bands and Bitmasks in Google Earth Engine (https://spatialthoughts.com/2021/08/19/qa-bands-bitmasks-gee/).

[Figure: Image before and after applying the pixel-wise QA bitmask]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A02-Earth-Engine-Intermediate%2F04b_Cloud_Masking_(complete))

var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');
var geometry = ee.Geometry.Point([77.60412933051538, 12.952912912328241]);

var filteredS2 = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 35))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));

// Sort the collection and pick the most cloudy image
var filteredS2Sorted = filteredS2.sort({
  property: 'CLOUDY_PIXEL_PERCENTAGE',
  ascending: false
});

var image = filteredS2Sorted.first();

var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};

Map.centerObject(image);
Map.addLayer(image, rgbVis, 'Full Image', false);

// Write a function for Cloud masking
function maskS2clouds(image) {
  var qa = image.select('QA60');
  var cloudBitMask = 1 << 10;
  var cirrusBitMask = 1 << 11;
  var mask = qa.bitwiseAnd(cloudBitMask).eq(0).and(
             qa.bitwiseAnd(cirrusBitMask).eq(0));
  return image.updateMask(mask)
    .select('B.*')
    .copyProperties(image, ['system:time_start']);
}

var imageMasked = ee.Image(maskS2clouds(image));
Map.addLayer(imageMasked, rgbVis, 'Masked Image');

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A02-Earth-Engine-Intermediate%2F04c_Cloud_Masking_(exercise))

var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');
var geometry = ee.Geometry.Point([77.60412933051538, 12.952912912328241]);

var filteredS2 = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 35))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));

// Sort the collection and pick the most cloudy image
var filteredS2Sorted = filteredS2.sort({
  property: 'CLOUDY_PIXEL_PERCENTAGE',
  ascending: false
});

var image = filteredS2Sorted.first();

var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};

Map.centerObject(image);
Map.addLayer(image, rgbVis, 'Full Image', false);

// Write a function for Cloud masking
function maskS2clouds(image) {
  var qa = image.select('QA60');
  var cloudBitMask = 1 << 10;
  var cirrusBitMask = 1 << 11;
  var mask = qa.bitwiseAnd(cloudBitMask).eq(0).and(
             qa.bitwiseAnd(cirrusBitMask).eq(0));
  return image.updateMask(mask)
    .select('B.*')
    .copyProperties(image, ['system:time_start']);
}

var imageMasked = ee.Image(maskS2clouds(image));
Map.addLayer(imageMasked, rgbVis, 'Masked Image (QA60 Band)', false);

// Use Google's Cloud Score+ Mask
var csPlus = ee.ImageCollection('GOOGLE/CLOUD_SCORE_PLUS/V1/S2_HARMONIZED');
var csPlusBands = csPlus.first().bandNames();

// Link S2 and CS+ results.
var imageWithCs = image.linkCollection(csPlus, csPlusBands);

// Function to mask pixels with low CS+ QA scores.
function maskLowQA(image) {
  var qaBand = 'cs';
  var clearThreshold = 0.6;
  var mask = image.select(qaBand).gte(clearThreshold);
  return image.updateMask(mask);
}

var imageMaskedCs = ee.Image(maskLowQA(imageWithCs));
Map.addLayer(imageMaskedCs, rgbVis, 'Masked Image (Cloud Score+)');

// Google Cloud Score+ dataset provides state-of-the-art
// cloud masks for Sentinel-2 images.
// It grades each image pixel on a continuous scale between 0 and 1,
// 0 = 'not clear' (occluded),
// 1 = 'clear' (unoccluded)

// Exercise
// Delete the 'geometry' variable and add a point at your chosen location
// Run the script and compare the results of the CloudScore+ mask with
// the QA60 mask
// Adjust the 'clearThreshold' value to suit your scene

// Hint:
// clearThreshold values between 0.50 and 0.65 generally work well
// Higher values will remove thin clouds, haze & cirrus shadows

Learn more about the Cloud Score+ (https://medium.com/google-earth/all-clear-with-cloud-score-bd6ee2e2235e) project.

05. Reducers

When writing parallel computing code, a Reduce operation allows you to compute statistics on a large amount of inputs. In Earth Engine, you need to run a reduction operation when creating composites, calculating statistics, doing regression analysis etc. The Earth Engine API comes with a large number of built-in reducer functions (such as ee.Reducer.sum(), ee.Reducer.histogram(), ee.Reducer.linearFit() etc.) that can perform a variety of statistical operations on input data. You can run reducers using the reduce() function. Earth Engine supports running reducers on all data structures that can hold multiple values, such as Images (reducers run on different bands), ImageCollection, FeatureCollection, List, Dictionary etc. The script below introduces basic concepts related to reducers.
Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A02-Earth-Engine-Intermediate%2F05b_Reducers_(complete))

// Computing stats on a list
var myList = ee.List.sequence(1, 10);
print(myList);

// Use a reducer to compute the average value
var mean = myList.reduce(ee.Reducer.mean());
print(mean);

var geometry = ee.Geometry.Polygon([[
  [82.60642647743225, 27.16350437805251],
  [82.60984897613525, 27.1618529901377],
  [82.61088967323303, 27.163695288375266],
  [82.60757446289062, 27.16517483230927]
]]);
var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');
Map.centerObject(geometry);

// Apply a reducer on an image collection
var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry))
  .select('B.*');

print(filtered.size());
var collMean = filtered.reduce(ee.Reducer.mean());
print('Reducer on Collection', collMean);

var image = ee.Image('COPERNICUS/S2/20190223T050811_20190223T051829_T44RPR');

var rgbVis = {min: 0.0, max: 3000, bands: ['B4', 'B3', 'B2']};
Map.addLayer(image, rgbVis, 'Image');
Map.addLayer(geometry, {color: 'red'}, 'Farm');

// If we want to compute the average value in each band,
// we can use reduceRegion instead
var stats = image.reduceRegion({
  reducer: ee.Reducer.mean(),
  geometry: geometry,
  scale: 10,
  maxPixels: 1e10
});
print(stats);

// Result of reduceRegion is a dictionary.
// We can extract the values using the .get() function
print('Average value in B4', stats.get('B4'));

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A02-Earth-Engine-Intermediate%2F05c_Reducers_(exercise))

var geometry = ee.Geometry.Polygon([[
  [82.60642647743225, 27.16350437805251],
  [82.60984897613525, 27.1618529901377],
  [82.61088967323303, 27.163695288375266],
  [82.60757446289062, 27.16517483230927]
]]);

var rgbVis = {min: 0.0, max: 3000, bands: ['B4', 'B3', 'B2']};

var image = ee.Image('COPERNICUS/S2_HARMONIZED/20190223T050811_20190223T051829_T44RPR');

Map.addLayer(image, rgbVis, 'Image');
Map.addLayer(geometry, {color: 'red'}, 'Farm');
Map.centerObject(geometry);

var ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi');

// Exercise
// Compute the average NDVI for the farm from the given image
// Hint: Use the reduceRegion() function
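A sketch of the computation the exercise asks for is shown below; it simply applies the same reduceRegion() pattern from the script above to the single-band NDVI image, with a 10 m scale matching the Sentinel-2 bands used to compute NDVI.

// Sketch: reduce the NDVI image over the farm polygon with a mean reducer.
var ndviStats = ndvi.reduceRegion({
  reducer: ee.Reducer.mean(),
  geometry: geometry,
  scale: 10,
  maxPixels: 1e10
});
print('Average NDVI for the farm', ndviStats.get('ndvi'));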
06. Time-Series Charts

Now we can put together all the skills we have learned so far — filter, map, reduce, and cloud-masking — to create a chart of average NDVI values for a given farm over 1 year. The Earth Engine API comes with support for charting functions based on the Google Chart API. Here we use the ui.Chart.image.series() function to create a time-series chart.

[Figure: Computing NDVI Time-series for a Farm]

[Figure: NDVI Time-series showing Dual-Cropping Cycle]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A02-Earth-Engine-Intermediate%2F06b_Time_Series_Charts_(complete))

var s2 = ee.ImageCollection('COPERNICUS/S2_HARMONIZED');
var geometry = ee.Geometry.Polygon([[
  [82.60642647743225, 27.16350437805251],
  [82.60984897613525, 27.1618529901377],
  [82.61088967323303, 27.163695288375266],
  [82.60757446289062, 27.16517483230927]
]]);
Map.addLayer(geometry, {color: 'red'}, 'Farm');
Map.centerObject(geometry);

var filtered = s2
  .filter(ee.Filter.date('2017-01-01', '2018-01-01'))
  .filter(ee.Filter.bounds(geometry));

// Load the Cloud Score+ collection
var csPlus = ee.ImageCollection('GOOGLE/CLOUD_SCORE_PLUS/V1/S2_HARMONIZED');
var csPlusBands = csPlus.first().bandNames();

// We need to add Cloud Score+ bands to each Sentinel-2
// image in the collection
// This is done using the linkCollection() function
var filteredS2WithCs = filtered.linkCollection(csPlus, csPlusBands);

// Function to mask pixels with low CS+ QA scores.
var maskLowQA = function(image) {
  var qaBand = 'cs';
  var clearThreshold = 0.5;
  var mask = image.select(qaBand).gte(clearThreshold);
  return image.updateMask(mask);
};

// map() the function to mask all images
var filteredMasked = filteredS2WithCs
  .map(maskLowQA)
  .select('B.*');

// Function to scale pixel values to reflectance
var scaleValues = function(image) {
  var scaledImage = image.multiply(0.0001);
  // This creates a new image and we lose the properties
  // Copy the system:time_start property
  // from the original image
  return scaledImage.copyProperties(image, ['system:time_start']);
};

// Function that computes NDVI for an image and adds it as a band
var addNDVI = function(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi');
  return image.addBands(ndvi);
};

// Map the functions over the collection
var withNdvi = filteredMasked
  .map(scaleValues)
  .map(addNDVI);

// Display a time-series chart
var chart = ui.Chart.image.series({
  imageCollection: withNdvi.select('ndvi'),
  region: geometry,
  reducer: ee.Reducer.mean(),
  scale: 10
}).setOptions({
  lineWidth: 1,
  pointSize: 2,
  title: 'NDVI Time Series',
  interpolateNulls: true,
  vAxis: {title: 'NDVI'},
  hAxis: {title: '', format: 'YYYY-MMM'}
});
print(chart);

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A02-Earth-Engine-Intermediate%2F06c_Time_Series_Charts_(exercise))

// Delete the farm boundary from the previous script
// and add another farm at a location of your choice
// Print the chart.

Assignment 2

[Figure: Assignment 2 Expected Output]

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3AAssignments%2FAssignment2)
var terraclimate = ee.ImageCollection('IDAHO_EPSCOR/TERRACLIMATE');
var geometry = ee.Geometry.Point([77.54849920033682, 12.91215162480037]);

// Assignment
// Use the TerraClimate dataset to chart a 50 year time series
// of temperature at any location

// Workflow
// Load the TerraClimate collection
// Select the 'tmmx' band
// Scale the band values
// Filter the scaled collection to the desired date range
// Use the ui.Chart.image.series() function to create the chart

// Hint1
// The 'tmmx' band has a scaling factor of 0.1 as per
// https://developers.google.com/earth-engine/datasets/catalog/IDAHO_EPSCOR_TERRACLIMATE#bands
// This means that we need to multiply each pixel value by 0.1
// to obtain the actual temperature value
// map() a function and multiply each image

var tmax = terraclimate.select('tmmx');

// Function that applies the scaling factor to each image
// Multiplying creates a new image that doesn't have the same properties
// Use the copyProperties() function to copy the timestamp to the new image
var scaleImage = function(image) {
  return image.multiply(0.1)
    .copyProperties(image, ['system:time_start']);
};

var tmaxScaled = tmax.map(scaleImage);

// Hint2
// You will need to specify the pixel resolution as the scale parameter
// in the charting function
// Use projection().nominalScale() to find the
// image resolution in meters
var image = ee.Image(terraclimate.first());
print(image.projection().nominalScale());
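For illustration, one way the remaining steps could be put together is sketched below; the 50-year date range and the scale value are assumptions (the scale should be whatever projection().nominalScale() prints for TerraClimate, roughly 4638.3 m), so adjust both to your needs.

// Sketch of a possible completion (assumed date range and scale):
var filtered = tmaxScaled.filter(ee.Filter.date('1970-01-01', '2020-01-01'));

var chart = ui.Chart.image.series({
  imageCollection: filtered,
  region: geometry,
  reducer: ee.Reducer.mean(),
  scale: 4638.3  // approximate nominal scale of TerraClimate pixels
}).setOptions({
  title: 'Maximum Temperature Time Series',
  vAxis: {title: 'Temperature (°C)'},
  hAxis: {title: ''}
});
print(chart);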
Module 3: Supervised Classification

Introduction to Machine Learning and Supervised Classification

Supervised classification is arguably the most important classical machine learning technique in remote sensing. Applications range from generating Land Use/Land Cover maps to change detection. Google Earth Engine is uniquely suited to do supervised classification at scale. The interactive nature of Earth Engine development allows for iterative development of supervised classification workflows by combining many different datasets into the model. This module covers the basic supervised classification workflow, accuracy assessment, hyperparameter tuning and change detection.

Watch the Video → (https://www.youtube.com/watch?v=IULweRpkMv8&list=PLppGmFLhQ1HJulb7qMikWv11HiEQhySha&index=4)

01. Basic Supervised Classification

We will learn how to do a basic land cover classification using training samples collected from the Code Editor using the high-resolution basemap imagery provided by Google Maps. This method requires no prior training data and is quite effective for generating high quality classification samples anywhere in the world. The goal is to classify each source pixel into one of the following classes: urban, bare, water or vegetation. Using the drawing tools in the code editor, you create 4 new feature collections with points representing pixels of that class. Each feature collection has a property called landcover with values of 0, 1, 2 or 3 indicating whether the feature collection represents urban, bare, water or vegetation respectively. We then train a Random Forest classifier using these training sets to build a model and apply it to all the pixels of the image to create a 4-class image.

Fun fact: The classifiers in the Earth Engine API have names starting with smile — such as ee.Classifier.smileRandomForest(). The smile part refers to the Statistical Machine Intelligence and Learning Engine (SMILE) (https://haifengl.github.io/index.html) JAVA library which is used by Google Earth Engine to implement these algorithms.

[Figure: Supervised Classification Output]

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A03-Supervised-Classification%2F01d_Basic_Supervised_Classification_(noimport))

var bangalore = ee.FeatureCollection('users/ujavalgandhi/public/bangalore_boundary');
var geometry = bangalore.geometry();

var s2 = ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED');

// The following collections were created using the
// Drawing Tools in the code editor
var urban = ee.FeatureCollection('users/ujavalgandhi/e2e/urban_gcps');
var bare = ee.FeatureCollection('users/ujavalgandhi/e2e/bare_gcps');
var water = ee.FeatureCollection('users/ujavalgandhi/e2e/water_gcps');
var vegetation = ee.FeatureCollection('users/ujavalgandhi/e2e/vegetation_gcps');

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry))
  .select('B.*');

var composite = filtered.median();

// Display the input composite.
var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};
Map.addLayer(composite.clip(geometry), rgbVis, 'image');

var gcps = urban.merge(bare).merge(water).merge(vegetation);

// Overlay the points on the image to get training data.
var training = composite.sampleRegions({
  collection: gcps,
  properties: ['landcover'],
  scale: 10
});

// Train a classifier.
var classifier = ee.Classifier.smileRandomForest(50).train({
  features: training,
  classProperty: 'landcover',
  inputProperties: composite.bandNames()
});

// Classify the image.
var classified = composite.classify(classifier);

// Choose a 4-color palette
// Assign a color for each class in the following order
// Urban, Bare, Water, Vegetation
var palette = ['#cc6d8f', '#ffc107', '#1e88e5', '#004d40'];

Map.addLayer(classified.clip(geometry), {min: 0, max: 3, palette: palette}, '2019');

// Display the GCPs
// We use the style() function to style the GCPs
var palette = ee.List(palette);
var landcover = ee.List([0, 1, 2, 3]);

var gcpsStyled = ee.FeatureCollection(
  landcover.map(function(lc) {
    var color = palette.get(landcover.indexOf(lc));
    var markerStyle = {
      color: 'white',
      pointShape: 'diamond',
      pointSize: 4,
      width: 1,
      fillColor: color
    };
    return gcps.filter(ee.Filter.eq('landcover', lc))
      .map(function(point) {
        return point.set('style', markerStyle);
      });
  })).flatten();

Map.addLayer(gcpsStyled.style({styleProperty: 'style'}), {}, 'GCPs');
Map.centerObject(gcpsStyled);
Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A03-Supervised-Classification%2F01c_Basic_Supervised_Classification_(exercise))

var s2 = ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED');

// Perform supervised classification for your city

// Delete the geometry below and draw a polygon
// over your chosen city
var geometry = ee.Geometry.Polygon([[
  [77.4149, 13.1203],
  [77.4149, 12.7308],
  [77.8090, 12.7308],
  [77.8090, 13.1203]
]]);
Map.centerObject(geometry);

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry))
  .select('B.*');

var composite = filtered.median();

// Display the input composite
var rgbVis = {min: 0.0, max: 3000, bands: ['B4', 'B3', 'B2']};
Map.addLayer(composite.clip(geometry), rgbVis, 'image');

// Exercise
// Add training points for 4 classes
// Assign the 'landcover' property as follows
// urban: 0
// bare: 1
// water: 2
// vegetation: 3

// After adding points, uncomment the lines below

// var gcps = urban.merge(bare).merge(water).merge(vegetation);

// // Overlay the points on the image to get training data
// var training = composite.sampleRegions({
//   collection: gcps,
//   properties: ['landcover'],
//   scale: 10,
//   tileScale: 16
// });
// print(training);

// // Train a classifier.
// var classifier = ee.Classifier.smileRandomForest(50).train({
//   features: training,
//   classProperty: 'landcover',
//   inputProperties: composite.bandNames()
// });

// // Classify the image.
// var classified = composite.classify(classifier);

// // Choose a 4-color palette
// // Assign a color for each class in the following order
// // Urban, Bare, Water, Vegetation
// var palette = ['#cc6d8f', '#ffc107', '#1e88e5', '#004d40'];
// Map.addLayer(classified.clip(geometry), {min: 0, max: 3, palette: palette}, '2019');

02. Accuracy Assessment

It is important to get a quantitative estimate of the accuracy of the classification. To do this, a common strategy is to divide your training samples into 2 random fractions — one used for training the model and the other for validation of the predictions. Once a classifier is trained, it can be used to classify the entire image. We can then compare the classified values with the ones in the validation fraction. We can use the ee.Classifier.confusionMatrix() method to calculate a Confusion Matrix representing expected accuracy.

Classification results are evaluated based on the following metrics:

- Overall Accuracy: How many samples were classified correctly.
- Producer's Accuracy: How well did the classification predict each class.
- Consumer's Accuracy (Reliability): How reliable is the prediction in each class.
- Kappa Coefficient: How well the classification performed as compared to random assignment.

[Figure: Accuracy Assessment]

Don't get carried away tweaking your model to give you the highest validation accuracy. You must use both qualitative measures (such as visual inspection of results) along with quantitative measures to assess the results.

Open in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A03-Supervised-Classification%2F02b_Accuracy_Assessment_(complete))
var s2 = ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED');
var basin = ee.FeatureCollection('WWF/HydroSHEDS/v1/Basins/hybas_7');
var gcp = ee.FeatureCollection('users/ujavalgandhi/e2e/arkavathy_gcps');

var arkavathy = basin.filter(ee.Filter.eq('HYBAS_ID', 4071139640));
var geometry = arkavathy.geometry();
Map.centerObject(geometry);

var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry))
  .select('B.*');

var composite = filtered.median();

// Display the input composite.
Map.addLayer(composite.clip(geometry), rgbVis, 'image');

// Add a random column and split the GCPs into training and validation set
var gcp = gcp.randomColumn();

// This being a simpler classification, we take 60% points
// for validation. Normal recommended ratio is
// 70% training, 30% validation
var trainingGcp = gcp.filter(ee.Filter.lt('random', 0.6));
var validationGcp = gcp.filter(ee.Filter.gte('random', 0.6));

// Overlay the points on the image to get training data.
var training = composite.sampleRegions({
  collection: trainingGcp,
  properties: ['landcover'],
  scale: 10,
  tileScale: 16
});

// Train a classifier.
var classifier = ee.Classifier.smileRandomForest(50)
  .train({
    features: training,
    classProperty: 'landcover',
    inputProperties: composite.bandNames()
  });

// Classify the image.
var classified = composite.classify(classifier);

var palette = ['#cc6d8f', '#ffc107', '#1e88e5', '#004d40'];
Map.addLayer(classified.clip(geometry), {min: 0, max: 3, palette: palette}, '2019');

//************************************************************************
// Accuracy Assessment
//************************************************************************

// Use the classification map to assess accuracy using the validation fraction
// of the overall training set created above.
var test = classified.sampleRegions({
  collection: validationGcp,
  properties: ['landcover'],
  tileScale: 16,
  scale: 10,
});

var testConfusionMatrix = test.errorMatrix('landcover', 'classification');

// Printing of confusion matrix may time out. Alternatively, you can export it as CSV
print('Confusion Matrix', testConfusionMatrix);
print('Test Accuracy', testConfusionMatrix.accuracy());

// Alternate workflow
// This is similar to machine learning practice
var validation = composite.sampleRegions({
  collection: validationGcp,
  properties: ['landcover'],
  tileScale: 16
});

var test = validation.classify(classifier);

var testConfusionMatrix = test.errorMatrix('landcover', 'classification');

// Printing of confusion matrix may time out. Alternatively, you can export it as CSV
print('Confusion Matrix', testConfusionMatrix);
print('Test Accuracy', testConfusionMatrix.accuracy());

Exercise

Try in Code Editor → (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A03-Supervised-Classification%2F02c_Accuracy_Assessment_(exercise))
var classified = composite.classify(classifier);

//**************************************************************************
// Accuracy Assessment
//**************************************************************************

// Use classification map to assess accuracy using the validation fraction
// of the overall training set created above.
var test = classified.sampleRegions({
  collection: validationGcp,
  properties: ['landcover'],
  tileScale: 16,
  scale: 10,
});

var testConfusionMatrix = test.errorMatrix('landcover', 'classification');
print('Confusion Matrix', testConfusionMatrix);
print('Test Accuracy', testConfusionMatrix.accuracy());

// Exercise
// Calculate and print the following assessment metrics
// 1. Producer's accuracy
// 2. Consumer's accuracy
// 3. F1-score
//
// Hint: Look at the ee.ConfusionMatrix module for appropriate methods

03. Improving the Classification

The Earth Engine data-model is especially well suited for machine learning tasks because of its ability to easily incorporate data sources of different spatial resolutions, projections and data types together. By giving additional information to the classifier, it is able to separate different classes easily. Here we take the same example and augment it with the following techniques:

- Apply Cloud Masking
- Add Spectral Indices: We add bands for different spectral indices such as - NDVI, NDBI, MNDWI and BSI.
- Add Elevation and Slope: We also add slope and elevation bands from the ALOS DEM.
- Normalize the Inputs: Machine learning models work best when all the inputs have the same scale. We will divide each band with the maximum value. This method ensures that all input values are between 0-1. A more complete and robust technique for image normalization is provided in the course Supplement.

Our training features have more parameters and contain values of the same scale. The result is a much improved classification.

Improved Classification Accuracy with use of Spectral Indices and Elevation Data

Open in Code Editor ↗ (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A03-Supervised-Classification%2F03b_Improving_the_Classification_(complete))

var s2 = ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED');
var basin = ee.FeatureCollection('WWF/HydroSHEDS/v1/Basins/hybas_7');
var gcp = ee.FeatureCollection('users/ujavalgandhi/e2e/arkavathy_gcps');
var alos = ee.ImageCollection('JAXA/ALOS/AW3D30/V3_2');

var arkavathy = basin.filter(ee.Filter.eq('HYBAS_ID', 4071139640));
var geometry = arkavathy.geometry();
Map.centerObject(geometry);

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.bounds(geometry));

// Load the Cloud Score+ collection
var csPlus = ee.ImageCollection('GOOGLE/CLOUD_SCORE_PLUS/V1/S2_HARMONIZED');
var csPlusBands = csPlus.first().bandNames();

// We need to add Cloud Score + bands to each Sentinel-2
// image in the collection
// This is done using the linkCollection() function
var filteredS2WithCs = filtered.linkCollection(csPlus, csPlusBands);

// Function to mask pixels with low CS+ QA scores.
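// (The Cloud Score+ 'cs' band is a per-pixel quality score between 0 and 1,
// where higher values indicate clearer pixels; pixels scoring at or above
// clearThreshold are kept and the rest are masked before compositing.)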
function maskLowQA(image) {
  var qaBand = 'cs';
  var clearThreshold = 0.5;
  var mask = image.select(qaBand).gte(clearThreshold);
  return image.updateMask(mask);
}

var filteredMasked = filteredS2WithCs
  .map(maskLowQA)
  .select('B.*');

var composite = filteredMasked.median();

var addIndices = function(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename(['ndvi']);
  var ndbi = image.normalizedDifference(['B11', 'B8']).rename(['ndbi']);
  var mndwi = image.normalizedDifference(['B3', 'B11']).rename(['mndwi']);
  var bsi = image.expression(
      '(( X + Y ) - (A + B)) / (( X + Y ) + (A + B))', {
        'X': image.select('B11'), // swir1
        'Y': image.select('B4'),  // red
        'A': image.select('B8'),  // nir
        'B': image.select('B2'),  // blue
  }).rename('bsi');
  return image.addBands(ndvi).addBands(ndbi).addBands(mndwi).addBands(bsi);
};

var composite = addIndices(composite);

// Calculate Slope and Elevation
// Use ALOS World 3D v3
// This comes as a collection of images
// We mosaic it to create a single image
// Need to set the projection correctly on the mosaic
// for the slope computation
var proj = alos.first().projection();
var elevation = alos.select('DSM').mosaic()
  .setDefaultProjection(proj)
  .rename('elev');

var slope = ee.Terrain.slope(elevation)
  .rename('slope');

var composite = composite.addBands(elevation).addBands(slope);

var visParams = {bands: ['B4', 'B3', 'B2'], min: 0, max: 3000, gamma: 1.2};
Map.addLayer(composite.clip(geometry), visParams, 'RGB');

// Normalize the image
// Machine Learning algorithms work best on images when all features have
// the same range

// Function to Normalize Image
// Pixel Values should be between 0 and 1
// Formula is (x - xmin) / (xmax - xmin)
function normalize(image){
  var bandNames = image.bandNames();
  // Compute min and max of the image
  var minDict = image.reduceRegion({
    reducer: ee.Reducer.min(),
    geometry: geometry,
    scale: 10,
    maxPixels: 1e9,
    bestEffort: true,
    tileScale: 16
  });
  var maxDict = image.reduceRegion({
    reducer: ee.Reducer.max(),
    geometry: geometry,
    scale: 10,
    maxPixels: 1e9,
    bestEffort: true,
    tileScale: 16
  });
  var mins = ee.Image.constant(minDict.values(bandNames));
  var maxs = ee.Image.constant(maxDict.values(bandNames));

  var normalized = image.subtract(mins).divide(maxs.subtract(mins));
  return normalized;
}

var composite = normalize(composite);

// Add a random column and split the GCPs into training and validation set
var gcp = gcp.randomColumn();

// This being a simpler classification, we take 60% points
// for validation. Normal recommended ratio is
// 70% training, 30% validation
var trainingGcp = gcp.filter(ee.Filter.lt('random', 0.6));
var validationGcp = gcp.filter(ee.Filter.gte('random', 0.6));

// Overlay the point on the image to get training data.
var training = composite.sampleRegions({
  collection: trainingGcp,
  properties: ['landcover'],
  scale: 10,
  tileScale: 16
});
print(training);

// Train a classifier.
var classifier = ee.Classifier.smileRandomForest(50)
  .train({
    features: training,
    classProperty: 'landcover',
    inputProperties: composite.
      bandNames()
  });

// Classify the image
var classified = composite.classify(classifier);

var palette = ['#cc6d8f', '#ffc107', '#1e88e5', '#004d40'];
Map.addLayer(classified.clip(geometry), {min: 0, max: 3, palette: palette}, '2019');

//**************************************************************************
// Accuracy Assessment
//**************************************************************************

// Use classification map to assess accuracy using the validation fraction
// of the overall training set created above.
var test = classified.sampleRegions({
  collection: validationGcp,
  properties: ['landcover'],
  tileScale: 16
});

var testConfusionMatrix = test.errorMatrix('landcover', 'classification');
// Printing of confusion matrix may time out. Alternatively, you can export it as CSV
print('Confusion Matrix', testConfusionMatrix);
print('Test Accuracy', testConfusionMatrix.accuracy());

Exercise

Try in Code Editor ↗ (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A03-Supervised-Classification%2F03c_Improving_the_Classification_(exercise))

// Exercise
// Improve your classification from Exercise 01c
// Add different spectral indices to your composite
// by using the function below

var addIndices = function(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename(['ndvi']);
  var ndbi = image.normalizedDifference(['B11', 'B8']).rename(['ndbi']);
  var mndwi = image.normalizedDifference(['B3', 'B11']).rename(['mndwi']);
  var bsi = image.expression(
      '(( X + Y ) - (A + B)) / (( X + Y ) + (A + B))', {
        'X': image.select('B11'), // swir1
        'Y': image.select('B4'),  // red
        'A': image.select('B8'),  // nir
        'B': image.select('B2'),  // blue
  }).rename('bsi');
  return image.addBands(ndvi).addBands(ndbi).addBands(mndwi).addBands(bsi);
};

04. Exporting Classification Results

When working with complex classifiers over large regions, you may get a 'User memory limit exceeded' or 'Computation timed out' error in the Code Editor. The reason for this is that there is a fixed time limit and smaller memory allocated for code that is run with the On-Demand Computation mode. For larger computations, you can use the Batch mode with the Export functions. Exports run in the background and can run longer than the 5-minutes time allocated to the computation code run from the Code Editor. This allows you to process very large and complex datasets. Here's an example showing how to export your classification results to Google Drive.

Exported Classification Outputs

Open in Code Editor ↗ (https://code.earthengine.google.co.in/?scriptPath=users%2Fujavalgandhi%2FEnd-to-End-GEE%3A03-Supervised-Classification%2F04b_Exporting_Classification_Results_(complete))

var s2 = ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED');
var basin = ee.FeatureCollection('WWF/HydroSHEDS/v1/Basins/hybas_7');
var gcp = ee.FeatureCollection('users/ujavalgandhi/e2e/arkavathy_gcps');
var alos = ee.ImageCollection('JAXA/ALOS/AW3D30/V3_2');

var arkavathy = basin.filter(ee.Filter.eq('HYBAS_ID', 4071139640));
var geometry = arkavathy.geometry();
Map.centerObject(geometry);

var rgbVis = {
  min: 0.0,
  max: 3000,
  bands: ['B4', 'B3', 'B2'],
};

var filtered = s2
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 30))
  .filter(ee.Filter.date('2019-01-01', '2020-01-01'))
  .filter(ee.Filter.
    bounds(geometry));

// Load the Cloud Score+ collection
var csPlus = ee.ImageCollection('GOOGLE/CLOUD_SCORE_PLUS/V1/S2_HARMONIZED');
var csPlusBands = csPlus.first().bandNames();

// We need to add Cloud Score + bands to each Sentinel-2
// image in the collection
// This is done using the linkCollection() function
var filteredS2WithCs = filtered.linkCollection(csPlus, csPlusBands);

// Function to mask pixels with low CS+ QA scores.
function maskLowQA(image) {
  var qaBand = 'cs';
  var clearThreshold = 0.5;
  var mask = image.select(qaBand).gte(clearThreshold);
  return image.updateMask(mask);
}

var filteredMasked = filteredS2WithCs
  .map(maskLowQA)
  .select('B.*');

var composite = filteredMasked.median();

var addIndices = function(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename(['ndvi']);
  var ndbi = image.normalizedDifference(['B11', 'B8']).rename(['ndbi']);
  var mndwi = image.normalizedDifference(['B3', 'B11']).rename(['mndwi']);
  var bsi = image.expression(
      '(( X + Y ) - (A + B)) / (( X + Y ) + (A + B))', {
        'X': image.select('B11'), // swir1
        'Y': image.select('B4'),  // red
        'A': image.select('B8'),  // nir
        'B': image.select('B2'),  // blue
  }).rename('bsi');
  return image.addBands(ndvi).addBands(ndbi).addBands(mndwi).addBands(bsi);
};

var composite = addIndices(composite);

// Calculate Slope and Elevation
// Use ALOS World 3D v3
// This comes as a collection of images
// We mosaic it to create a single image
// Need to set the projection correctly on the mosaic
// for the slope computation
var proj = alos.first().projection();
var elevation = alos.select('DSM').mosaic()
  .setDefaultProjection(proj)
  .rename('elev');

var slope = ee.Terrain.slope(elevation)
  .rename('slope');

var composite = composite.addBands(elevation).addBands(slope);

var visParams = {bands: ['B4', 'B3', 'B2'], min: 0, max: 3000, gamma: 1.2};
Map.addLayer(composite.clip(geometry), visParams, 'RGB');

// Normalize the image
// Machine Learning algorithms work best on images when all features have
// the same range

// Function to Normalize Image
// Pixel Values should be between 0 and 1
// Formula is (x - xmin) / (xmax - xmin)
function normalize(image){
  var bandNames = image.bandNames();
  // Compute min and max of the image
  var minDict = image.reduceRegion({
    reducer: ee.Reducer.min(),
    geometry: geometry,
    scale: 10,
    maxPixels: 1e9,
    bestEffort: true,
    tileScale: 16
  });
  var maxDict = image.reduceRegion({
    reducer: ee.Reducer.max(),
    geometry: geometry,
    scale: 10,
    maxPixels: 1e9,
    bestEffort: true,
    tileScale: 16
  });
  var mins = ee.Image.constant(minDict.values(bandNames));
  var maxs = ee.Image.constant(maxDict.values(bandNames));

  var normalized = image.subtract(mins).divide(maxs.subtract(mins));
  return normalized;
}

var composite = normalize(composite);

// Add a random column and split the GCPs into training and validation set
var gcp = gcp.randomColumn();

// This being a simpler classification, we take 60% points
// for validation. Normal recommended ratio is
// 70% training, 30% validation
var trainingGcp = gcp.filter(ee.Filter.lt('random', 0.6));
var validationGcp = gcp.filter(ee.Filter.gte('random', 0.6));

// Overlay the point on the image to get training data.
var training = composite.
  sampleRegions({
  collection: trainingGcp,
  properties: ['landcover'],
  scale: 10,
  tileScale: 16,
});
print(training);

// Train a classifier.
var classifier = ee.Classifier.smileRandomForest(50)
  .train({
    features: training,
    classProperty: 'landcover',
    inputProperties: composite.bandNames()
  });

// Classify the image.
var classified = composite.classify(classifier);

var palette = ['#cc6d8f', '#ffc107', '#1e88e5', '#004d40'];
Map.addLayer(classified.clip(geometry), {min: 0, max: 3, palette: palette}, '2019');

//**************************************************************************
// Accuracy Assessment
//**************************************************************************

// Use classification map to assess accuracy using the validation fraction
// of the overall training set created above.
var test = classified.sampleRegions({
  collection: validationGcp,
  properties: ['landcover'],
  tileScale: 16
});

var testConfusionMatrix = test.errorMatrix('landcover', 'classification');

//**************************************************************************
// Exporting Results
//**************************************************************************

// Export the classified image to Drive
//
// For images having integers (such as class numbers)
// we cast the image to floating point data type which
// allows the masked values to be saved as NaN values
// in the GeoTIFF format.
// You can set these to actual NoData values using
// GDAL tools after the export
// gdal_translate -a_nodata 'nan' input.tif output.tif
Export.image.toDrive({
  image: classified.clip(geometry).toFloat(),
  description: 'Classified_Image_Export',
  folder: 'earthengine',
  fileNamePrefix: 'classified',
  region: geometry,
  scale: 10,
  maxPixels: 1e10
});

// Export the results of accuracy assessment

// Create a Feature with null geometry and the value we want to export.
// Use .array() to convert Confusion Matrix to an Array so it can be
// exported in a CSV file
var fc = ee.FeatureCollection([
  ee.Feature(null, {
    'accuracy': testConfusionMatrix.accuracy(),
    'matrix': testConfusionMatrix.array()
  })
]);

print(fc);

Export.table.toDrive({
  collection: fc,
  description: 'Accuracy_Assessment_Export',
  folder: 'earthengine',
  fileNamePrefix: 'accuracy',
  fileFormat: 'CSV'
});

Exercise

It is also a good idea to export the classified image as an Asset. This will allow you to import the classified image in another script without running the whole classification workflow. Use the Export.image.toAsset() function to export the classified image as an asset. A minimal sketch of such an export is included after the exercise code below.

Try in Code Editor ↗ (https://code.earthengine.google

var filteredMasked = filteredS2WithCs
  .map(maskLowQA)
  .select('B.*');

var composite = filteredMasked.median();

var addIndices = function(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename(['ndvi']);
  var ndbi = image.normalizedDifference(['B11', 'B8']).rename(['ndbi']);
  var mndwi = image.normalizedDifference(['B3', 'B11']).rename(['mndwi']);
  var bsi = image.expression(
      '(( X + Y ) - (A + B)) / (( X + Y ) + (A + B))', {
        'X': image.select('B11'), // swir1
        'Y': image.select('B4'),  // red
        'A': image.select('B8'),  // nir
        'B': image.select('B2'),  // blue
  }).rename('bsi');
  return image.addBands(ndvi).addBands(ndbi).addBands(mndwi).addBands(bsi);
};

var composite = addIndices(composite);

// Calculate Slope and Elevation
// Use ALOS World 3D v3
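// (AW3D30 is a global ~30 m digital surface model; its 'DSM' band stores height
// above sea level in metres, which is used below for elevation and slope.)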
// This comes as a collection of images
// We mosaic it to create a single image
// Need to set the projection correctly on the mosaic
// for the slope computation
var proj = alos.first().projection();
var elevation = alos.select('DSM').mosaic()
  .setDefaultProjection(proj)
  .rename('elev');

var slope = ee.Terrain.slope(elevation)
  .rename('slope');

var composite = composite.addBands(elevation).addBands(slope);

var visParams = {bands: ['B4', 'B3', 'B2'], min: 0, max: 3000, gamma: 1.2};
Map.addLayer(composite.clip(geometry), visParams, 'RGB');

// Normalize the image
// Machine Learning algorithms work best on images when all features have
// the same range

// Function to Normalize Image
// Pixel Values should be between 0 and 1
// Formula is (x - xmin) / (xmax - xmin)
function normalize(image){
  var bandNames = image.bandNames();
  // Compute min and max of the image
  var minDict = image.reduceRegion({
    reducer: ee.Reducer.min(),
    geometry: geometry,
    scale: 10,
    maxPixels: 1e9,
    bestEffort: true,
    tileScale: 16
  });
  var maxDict = image.reduceRegion({
    reducer: ee.Reducer.max(),
    geometry: geometry,
    scale: 10,
    maxPixels: 1e9,
    bestEffort: true,
    tileScale: 16
  });
  var mins = ee.Image.constant(minDict.values(bandNames));
  var maxs = ee.Image.constant(maxDict.values(bandNames));

  var normalized = image.subtract(mins).divide(maxs.subtract(mins));
  return normalized;
}

var composite = normalize(composite);

// Add a random column and split the GCPs into training and validation set
var gcp = gcp.randomColumn();

// This being a simpler classification, we take 60% points
// for validation. Normal recommended ratio is
// 70% training, 30% validation
var trainingGcp = gcp.filter(ee.Filter.lt('random', 0.6));
var validationGcp = gcp.filter(ee.Filter.gte('random', 0.6));
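For the exercise above, here is a minimal sketch of an asset export. It assumes the classified image and geometry variables from the complete script; the assetId path is a placeholder that you should replace with a folder under your own Earth Engine assets.

// Export the classified image as an Earth Engine Asset
Export.image.toAsset({
  image: classified.clip(geometry),
  description: 'Classified_Image_Asset_Export',
  // Placeholder asset path - change to a location you can write to
  assetId: 'users/your_username/arkavathy_classified_2019',
  region: geometry,
  scale: 10,
  maxPixels: 1e10,
  // Class values are categorical, so build image pyramids with 'mode'
  pyramidingPolicy: {'.default': 'mode'}
});

Once the export task finishes, the classified image can be loaded in any other script with ee.Image('users/your_username/arkavathy_classified_2019').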
