In The Hitchhiker’s Guide to the Galaxy series there is a character named Slartibartfast who designs planets (including Earth, version 2, after the original gets obliterated by the unpleasant Vogons; on the first Earth he made a prize-winning Norway).
Presently, our skills are not as good as Slartibartfast’s, but there are a few notable attempts to reclaim damaged land and make it wild again. A new twist is to load the reclamation area with sensors. There, too, we are just upstarts: the best we have done so far is to make sensor boxes the size of a deck of playing cards and litter the land with them.
That is about to change. Sensors so small that they make no impact on the ecological system are coming online. The idea is to place millions of them within the landscape, organize the data that comes from them, and learn about and virtually experience the ecosystem as it recovers.
The DoppelLab project at the MIT Media Lab does something like this: the output of the sensors is viewed in a “reality browser”, in this case a 3D game environment (built in Unity3D) riddled with representations of whatever the sensors pick up. This will get even more interesting as the immersive tools for experiencing the virtual environment improve: goggles, better sound capture/playback systems, and haptic gloves.
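To make the pipeline concrete, here is a minimal sketch of the idea in Python. Everything in it is an assumption for illustration: the field names, the fake `poll_sensors` reading, and the way data is mapped to visual parameters are invented, not DoppelLab’s actual API. The point is only the shape of the system: read many small sensors, then translate each reading into something a 3D engine could draw.

```python
import random
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: int
    x: float            # position in the landscape (meters) -- hypothetical fields
    y: float
    temperature_c: float
    sound_level_db: float

def poll_sensors(n):
    """Simulate polling n field sensors (stand-in for a real radio/network read)."""
    return [
        SensorReading(
            sensor_id=i,
            x=random.uniform(0, 100),
            y=random.uniform(0, 100),
            temperature_c=random.uniform(5, 25),
            sound_level_db=random.uniform(20, 70),
        )
        for i in range(n)
    ]

def to_scene_objects(readings):
    """Map each reading to a description a game engine could instantiate:
    a position plus visual parameters derived from the data."""
    scene = []
    for r in readings:
        scene.append({
            "position": (r.x, 0.0, r.y),
            # warmer sensors render redder; louder ones render larger
            "color_red": (r.temperature_c - 5) / 20.0,
            "scale": 0.5 + r.sound_level_db / 100.0,
        })
    return scene

print(to_scene_objects(poll_sensors(5))[0])
```

In a real reality browser the last step would hand these descriptions to the engine (in Unity3D, instantiating prefabs at those positions) and refresh them as new readings stream in.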
Naturally, we would also like to smell the bog as we walk through. The sensing device (the artificial nose) has already been invented and miniaturized, so you might want to try your hand at designing the necessary accompanying playback schnozz.
PS I doubt any of this is making its way into the “common core”.
Here is a way to dip your toe, so to speak, into this scenario, virtually. Stanford offers a free online course in haptic design. The board costs about $35 on Amazon or eBay, though you will have to construct the rest of the kit yourself.
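If you are curious what such a course has you program, a standard first exercise in haptics is rendering a “virtual wall”: when the device handle crosses a virtual boundary, the motor pushes back with a force proportional to how far you have penetrated, and otherwise does nothing. The sketch below shows just that control law in Python; the wall position and stiffness values are made-up numbers, not anything from the Stanford course materials.

```python
def wall_force(position_m, wall_position_m=0.005, stiffness_n_per_m=200.0):
    """Virtual wall: a spring that only exists past the wall boundary.
    Returns the motor force command in newtons (negative = push back)."""
    penetration = position_m - wall_position_m
    if penetration <= 0:
        return 0.0          # handle is in free space: no force
    return -stiffness_n_per_m * penetration

# Sweep the handle position through the wall and print the commanded force
for pos_mm in [0, 4, 5, 6, 10]:
    print(f"{pos_mm} mm -> {wall_force(pos_mm / 1000.0):+.2f} N")
```

On real hardware this function would run inside a fast loop (read the position sensor, compute the force, command the motor), since a slow update rate makes the wall feel soft or buzzy.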
https://lagunita.stanford.edu/courses/SelfPaced/Haptics/2014/about