Lurker

Moderators
  • Posts

    4,559
  • Joined

  • Last visited

  • Days Won

    567

Lurker last won the day on May 25

Lurker had the most liked content!

About Lurker

  • Birthday 02/13/1983

Profile Information

  • Gender
    Male
  • Location
    INDONESIA
  • Interests
    GIS and Remote Sensing

Lurker's Achievements

  1. For most uses, Google Maps is a flat, 2D app, and if your device can handle more graphics and a bit more data, you can fire up the Google Earth 3D data set and get 3D buildings. At Google I/O, Google announced a new level that turns the graphics slider way, way up on Google Maps: Immersive View. When exploring an area in Google Maps, the company says Immersive View will make it "feel like you're right there before you ever set foot inside."

The video for this feature is wild. It basically turns Google Maps into a 3D version of SimCity with AAA video game graphics. There are simulated cars that drive along the roads, and birds fly through the sky. Clouds pass overhead and cast shadows on the world. The weather is simulated, and water has realistic reflections that change with the camera. London even has an animated Ferris wheel that spins around.

Google can't possibly be tracking things like the individual positions of birds (yet!), but a lot of this is real data. The cars represent the current traffic levels on a given street. The weather represents the actual weather, even for historical data. The sun moves in real time with the time of day. Another part of the video shows flying into a business that also has a whole 3D layout.

All of this is possible thanks to combining the massive data sets from Google Maps, Google Earth, and Street View, but even then, this level of fidelity will be very limited by the initial data sets. Google says that at first, Immersive View will "start... rolling out in Los Angeles, London, New York, San Francisco, and Tokyo later this year with more cities coming soon." The company says that "Immersive view will work on just about any phone and device," but just like the 3D building mode, this will be an optional toggle.
  2. Massive earthquakes don't just move the ground — they make speed-of-light adjustments to Earth's gravitational field. Now, researchers have trained computers to identify these tiny gravitational signals, demonstrating how the signals can be used to mark the location and size of a strong quake almost instantaneously. It's a first step to creating a very early warning system for the planet's most powerful quakes, scientists report May 11 in Nature.

Such a system could help solve a thorny problem in seismology: how to quickly pin down the true magnitude of a massive quake immediately after it happens, says Andrea Licciardi, a geophysicist at the Université Côte d'Azur in Nice, France. Without that ability, it's much harder to swiftly and effectively issue hazard warnings that could save lives.

As large earthquakes rupture, the shaking and shuddering send seismic waves through the ground that appear as large wiggles on seismometers. But current seismic wave–based detection methods notoriously have difficulty distinguishing between, say, a magnitude 7.5 and a magnitude 9 quake in the few seconds following such an event. That's because the initial estimations of magnitude are based on the height of seismic waves called P waves, which are the first to arrive at monitoring stations. Yet for the strongest quakes, those initial P wave amplitudes max out, making quakes of different magnitudes hard to tell apart.

But seismic waves aren't the earliest signs of a quake. All of that mass moving around in a big earthquake also changes the density of the rocks at different locations. Those shifts in density translate to tiny changes in Earth's gravitational field, producing "elastogravity" waves that travel through the ground at the speed of light — even faster than seismic waves.

Such signals were once thought to be too tiny to detect, says seismologist Martin Vallée of the Institut de Physique du Globe de Paris, who was not involved in the new study. Then in 2017, Vallée and his colleagues were the first to report seeing these elastogravity signals in seismic station data. Those findings proved that "you have a window in between the start of the earthquake and the time at which you receive the [seismic] waves," Vallée says.

But researchers still puzzled over how to turn these elastogravity signals into an effective early warning system. Because gravity wiggles are tiny, they are difficult to distinguish from background noise in seismic data. When scientists looked retroactively, they found that only six mega-earthquakes in the last 30 years have generated identifiable elastogravity signals, including the magnitude 9 Tohoku-Oki earthquake in 2011 that produced a devastating tsunami that flooded two nuclear power plants in Fukushima, Japan (SN: 3/16/11). (A P wave–based initial estimate of that quake's magnitude was 7.9.)

That's where computers can come in, Licciardi says. He and his colleagues created PEGSNet, a machine learning network designed to identify "Prompt ElastoGravity Signals." The researchers trained the machines on a combination of real seismic data collected in Japan and 500,000 simulated gravity signals for earthquakes in the same region. The synthetic gravity data are essential for the training, Licciardi says, because the real data are so scarce, and the machine learning model requires enough input to be able to find patterns in the data. (A toy sketch of this train-on-synthetics idea appears after this post.)
Once trained, the computers were then given a test: Track the origin and evolution of the 2011 Tohoku quake as though it were happening in real time. The result was promising, Licciardi says. The algorithm was able to accurately identify both the magnitude and location of the quake five to 10 seconds earlier than other methods.

This study is a proof of concept and hopefully the basis for a prototype of an early warning system, Licciardi says. "Right now, it's tailored to work … in Japan. We want to build something that can work in other areas" known for powerful quakes, including Chile and Alaska. Eventually, the hope is to build one system that can work globally.

The results show that PEGSNet has the potential to be a powerful tool for early earthquake warnings, particularly when used alongside other earthquake-detection tools, Vallée says. Still, more work needs to be done. For one thing, the algorithm was trained to look for a single point for an earthquake's origin, which is a reasonable approximation if you're far away. But close up, the origin of a quake no longer looks like a point; it's actually a larger region that has ruptured. In the future, if scientists want an accurate estimate of where a rupture happened, the machines will need to look for regions, not points, Vallée adds.

Bigger advances could come in the future as researchers develop much more sensitive instruments that can detect even tinier quake-caused perturbations to Earth's gravitational field while filtering out other sources of background noise that might obscure the signals. Earth, Vallée says, is a very noisy environment, from its oceans to its atmosphere. "It's a bit the same as the challenge that physicists face when they try to observe gravitational waves," Vallée says. These ripples in spacetime, triggered by colossal cosmic collisions, are a very different type of gravity-driven wave (SN: 2/11/16). But gravitational wave signals are also dwarfed by Earth's noisiness — in this case, microtremors in the ground.
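To make the train-on-synthetics idea concrete, here is a toy sketch in Python. It is not PEGSNet (the post does not describe that network's architecture); it simply fabricates noisy "gravity signal" traces whose amplitude grows with magnitude and fits a plain scikit-learn regressor in place of the deep network. Every name and number in it is invented for illustration.

```python
# Toy illustration only -- NOT the PEGSNet model described in the article.
# We generate synthetic traces whose amplitude scales with magnitude, bury
# them in noise, and train a simple regressor to recover the magnitude.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_events, n_samples = 5000, 128          # far fewer than the 500,000 simulations in the study

magnitudes = rng.uniform(7.0, 9.5, n_events)
t = np.linspace(0, 1, n_samples)

# Synthetic "gravity signal": amplitude grows steeply with magnitude,
# plus noise standing in for the background seismic noise that normally
# hides these signals.
signal = (10 ** (magnitudes[:, None] - 9.0)) * np.sin(2 * np.pi * 3 * t)
noise = 0.05 * rng.standard_normal((n_events, n_samples))
traces = signal + noise

X_train, X_test, y_train, y_test = train_test_split(
    traces, magnitudes, test_size=0.2, random_state=0)

model = GradientBoostingRegressor().fit(X_train, y_train)
pred = model.predict(X_test)
print("mean abs. magnitude error:", np.abs(pred - y_test).mean())
```

On real data the hard part is exactly what the article stresses: the signals are buried in background noise and genuine examples are rare, which is why the authors lean on hundreds of thousands of simulated traces for training.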
  3. You want to combine them? Use the Merge tool in ArcToolbox; it can merge them all at once. Or, if you want to do it manually, just start editing and do it one by one: select two lines, then use the editing tools and click Merge. A quick arcpy sketch of the tool-based route is below.
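For the tool-based route, a minimal arcpy sketch might look like this; the workspace and feature class names are placeholders you would swap for your own data.

```python
# Minimal sketch: merge several line feature classes into one.
# Paths and names below are placeholders, not real datasets.
import arcpy

arcpy.env.workspace = r"C:\data\project.gdb"  # hypothetical file geodatabase

# The Merge geoprocessing tool appends all input feature classes
# (same geometry type) into a single new output feature class.
arcpy.management.Merge(
    ["lines_part1", "lines_part2", "lines_part3"],  # placeholder inputs
    "lines_merged"                                   # placeholder output
)
```

Note the difference from the manual route: the Merge geoprocessing tool combines whole datasets, while the editor's Merge command combines selected features within a single layer.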
  4. Governments and businesses across the world are pledging to adopt more sustainable and equitable practices. Many are also working to limit activities that contribute to climate change. To support these efforts, Esri, the global leader in location intelligence, in partnership with Impact Observatory and Microsoft, is releasing a globally consistent 2017–2021 land-use and land-cover map of the world based on the most up-to-date 10-meter Sentinel-2 satellite data. In addition to the new 2021 data, 10-meter land-use and land-cover data for 2017, 2018, 2019, and 2020 is included, illustrating five years of change across the planet.

This digital rendering of Earth's surfaces offers detailed information and insights about how land is being used. The map is available online to more than 10 million users of geographic information system (GIS) software through Esri's ArcGIS Living Atlas of the World, the foremost collection of geographic information and services, including maps and apps.

"Accurate, timely, and accessible maps are critical for understanding the rapidly changing world, especially as the effects of climate change accelerate globally," said Jack Dangermond, Esri founder and president. "Planners worldwide can use this map to better understand complex challenges and take a geographic approach to decisions about food security, sustainable land use, surface water, and resource management."

Esri released a 2020 global land-cover map last year as well as a high-resolution 2050 global land-cover map, showing how Earth's land surfaces might look 30 years from now. With the planned annual releases, users will have the option to make year-to-year comparisons for detecting change in vegetation and crops, forest extents, bare surfaces, and urban areas. These maps also provide insights about locations with distinctive land use/land cover, as well as human activity affecting them. National government resource agencies use land-use/land-cover data as a basis for understanding trends in natural capital, which helps define land-planning priorities and determine budget allocations.

Esri's map layers were developed with imagery from the European Space Agency (ESA) Sentinel-2 satellite, with machine learning workflows by Esri Silver partner Impact Observatory and compute resources from longtime partner Microsoft. The Sentinel-2 satellite carries a multispectral imaging instrument (the broader Sentinel constellation also includes radar and atmospheric sensors), enabling it to monitor vegetation, soil and water cover, inland waterways, and coastal areas.

"World leaders need to set and achieve ambitious targets for sustainable development and environmental restoration," said Steve Brumby, Impact Observatory cofounder and CEO. "Impact Observatory [and] our partners Esri and Microsoft are once again first to deliver an annual set of global maps at unprecedented scale and speed. These maps of changing land use and land cover provide leaders in governments, industry, and finance with a new AI [artificial intelligence]-powered capability for timely, actionable geospatial insights on demand."

Esri and Microsoft have released this 10-meter-resolution time-series map under a Creative Commons license to encourage broad adoption and ensure equitable access for planners working to create a more sustainable planet. Users can manipulate the map layers and other data layers with GIS software to create more dynamic visualizations.
In addition to being freely available in ArcGIS Online as a map service, these resources are also available for download and viewing. To explore the new 2021 global land-use/land-cover map, visit livingatlas.arcgis.com/landcover.
  5. L3Harris Technologies' third high-resolution weather instrument is set to launch March 1 onboard a NOAA satellite – strengthening the nation's ability to monitor the environment and rapidly detect severe weather.

The Advanced Baseline Imager (ABI) is the primary instrument for the Geostationary Operational Environmental Satellite-T (GOES-T), the third in a series of four advanced geostationary weather satellites with L3Harris' ABI onboard. The ABIs are controlled by L3Harris' enterprise ground system. The ABI provides high-resolution video of weather and environmental systems using 16 spectral bands, delivering three times the spectral coverage and four times the resolution of the previous generation of GOES satellites while scanning five times faster.

The Advanced Baseline Imagers on NOAA's two current geostationary operational satellites, GOES-East and GOES-West, enable more accurate meteorological forecasts and a greater ability to study and monitor climate change, and allow experts to provide early warnings of severe weather conditions such as tornadoes, wildfires, and hurricanes.

"L3Harris' ABI has helped NOAA improve detection of wildfires, tornadoes and other extreme events that threaten lives," said Rob Mitrevski, Vice President and General Manager, Spectral Solutions, Space and Airborne Systems, L3Harris. "We have been pioneers in space-based weather monitoring for more than 60 years and continue to set a high standard of capability with our Advanced Baseline Imager. We look forward to driving further forecasting advancements, as we continue our collaborative partnership with NOAA into the future."

The company also announced the delivery of its fourth imager to NASA in late 2021. The fourth and final ABI was integrated into the GOES-U satellite last month and is slated to launch in 2024. GOES-U will complete NOAA's GOES-R series of advanced geostationary weather sensors and provide the groundwork for future Geostationary Extended Observations (GeoXO) imager programs, currently in the Phase A formulation stage, with L3Harris already at work on the next-generation geostationary imager concept design.
  6. At this time, USGS Landsat 9 Collection 2 Level-1 and Level-2 data will be made available for download from EarthExplorer, Machine to Machine (M2M), and LandsatLook. Initially, USGS will provide only full-bundle downloads. USGS will provide single band downloads and browse images, and Landsat 9 Collection 2 U.S. Analysis Ready Data shortly thereafter. Commercial cloud data distribution will take 3-5 days to reach full capacity.

The recently deployed Landsat 9 satellite passed its post-launch assessment review and is now operational. This milestone marks the beginning of the satellite's mission to extend Landsat's unparalleled, 50-year record of imaging Earth's land surfaces, surface waters, and coastal regions from space.

Landsat 9 launched September 27, 2021, from Vandenberg Space Force Base in California. The satellite carries two science instruments, the Operational Land Imager 2 (OLI-2) and the Thermal Infrared Sensor 2 (TIRS-2). The OLI-2 captures observations of the Earth's surface in visible, near-infrared, and shortwave-infrared bands, and TIRS-2 measures thermal infrared radiation, or heat, emitted from the Earth's surface.

Landsat 9 improvements include higher radiometric resolution for OLI-2 (14-bit quantization, increased from 12 bits for Landsat 8), enabling sensors to detect more subtle differences, especially over darker areas such as water or dense forests. With this higher radiometric resolution, Landsat 9 can differentiate 16,384 shades of a given wavelength. In comparison, Landsat 8 provides 12-bit data and 4,096 shades, and Landsat 7 detects only 256 shades with its 8-bit resolution (the quick bit-depth arithmetic is sketched after this post). In addition to the OLI-2 improvement, TIRS-2 has significantly reduced stray light compared to the Landsat 8 TIRS, which enables improved atmospheric correction and more accurate surface temperature measurements.

All commissioning and calibration activities show Landsat 9 performing just as well as, if not better than, Landsat 8. In addition to routine calibration methods (i.e., on-board calibration sources, lunar observations, pseudo-invariant calibration sites (PICS), and direct field in situ measurements), an underfly of Landsat 9 with Landsat 8 in mid-November 2021 provided cross-calibration between the two satellites' onboard instruments, ensuring data consistency across the Landsat Collection 2 archive.

Working in tandem with Landsat 8, Landsat 9 will provide major improvements to the nation's land imaging, sustainable resource management, and climate science capabilities. Landsat's imagery provides a landscape-level view of the land surface, surface waters (inland lakes and rivers) and coastal zones, and the changes that occur from both natural processes and human-induced activity.

"Landsat 9 is distinctive among Earth observation missions because it carries the honor to extend the 50-year Landsat observational record into the next 50 years," said Chris Crawford, USGS Landsat 9 Project Scientist. "Partnered in orbit with Landsat 8, Landsat 9 will ensure continued eight-day global land and near-shore revisit."

Since October 31, 2021, Landsat 9 has collected over 57,000 images of the planet and will collect approximately 750 images of Earth each day. These images will be processed, archived, and distributed from the USGS Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. Since 2008, the USGS Landsat Archive has provided more than 100 million images to data users around the world, free of charge.
Landsat 9 is a joint mission between the USGS and NASA and is the latest in the Landsat series of remote sensing satellites. The Landsat Program has been providing global coverage of landscape change since 1972. Landsat's unique long-term data record provides the basis for a critical understanding of environmental and climate changes occurring in the United States and around the world.

Data availability: learn more about Landsat 9 data access and the latest mission on the Landsat 9 webpages (USGS Landsat 9, NASA Landsat 9).
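A quick check of the quantization figures quoted above: an n-bit sensor encodes 2^n discrete levels, which is where the 256, 4,096, and 16,384 "shades" come from.

```python
# Discrete levels ("shades") per band follow directly from the bit depth:
# an n-bit quantizer can encode 2**n distinct values.
for satellite, bits in [("Landsat 7 (ETM+)", 8),
                        ("Landsat 8 (OLI)", 12),
                        ("Landsat 9 (OLI-2)", 14)]:
    print(f"{satellite}: {bits}-bit -> {2 ** bits:,} levels")

# Landsat 7 (ETM+): 8-bit -> 256 levels
# Landsat 8 (OLI): 12-bit -> 4,096 levels
# Landsat 9 (OLI-2): 14-bit -> 16,384 levels
```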
  7. ah, seems I need to get in line too, before GitHub officially launches it, LOL
  8. cool..... it's still at an early stage, right? beta?
  9. NASA has partnered with other US agencies and a Colorado-based drone manufacturer to create a UAV-based system to observe volcanoes for indications of what otherwise might be unexpected, catastrophic eruptions for local communities.

NASA has long been interested in deploying drones to replace human monitors and small aircraft habitually used for near-range examination of volcano activity and for gathering evidence of approaching eruptions. Tipped off by data collected with sensors on the craft, scientists could issue advance warnings when craters are getting ready to blow, as Cumbre Vieja on Spain's La Palma, Krakatau in Indonesia, and Hawai'i Island's Kīlauea all did last year. Though the tech presumably wouldn't be applicable to the underwater explosion of Hunga Tonga-Hunga Ha'apai that produced a calamitous tsunami for the people of Tonga, it would allow researchers to predict other looming eruptions – allowing local communities to prepare or evacuate beforehand.

NASA's first efforts to develop UAVs for that purpose date back to 2013, when a small craft was used to overfly the Turrialba Volcano in Costa Rica. The agency later struck up a long-running partnership with Black Swift Technologies, which produces drone systems for rugged enterprise and research applications. Their work eventually led to the creation of the S2, a fixed-wing, fully autonomous plane that was prepared for data collection flights above the Aleutian Islands' Makushin Volcano. Along the way, the project was joined by the US Geological Survey (USGS), which provided additional sensors to capture visual and thermal images and detect a fuller range of volcanic gases.

The S2 took to the air in a series of missions above Makushin last September. "We needed it to be really rugged, to withstand flying in the turbulent conditions and corrosive gases around volcanoes," said Florian Schwandner, director of the Earth Sciences division at NASA's Ames Research Center. "We also developed a gas-sensing payload the UAS could carry to look for signs of volcanic unrest."

To make the ground-breaking Makushin flights possible, however, the team had to convince the Federal Aviation Administration (FAA) to grant beyond visual line of sight (BVLOS) exemptions to enable the S2's 15-mile flights to the cone. Once there, the drone gathered high-resolution visible-light and thermal images, which allowed the scientists to detect and map changes in physical features indicating volcanic activity – and possible eruption. The gas-detecting sensors similarly passed their tests with flying colors.

"Our goal is to continue to push the capabilities of UASs to provide valuable insight into natural phenomena," said Jack Elston, Black Swift Technologies CEO. "This deployment demonstrated some state-of-the-art automation technologies we think will help greatly simplify what are now very difficult UAS operations. One of the most exciting results was to see our custom autopilot system determine when conditions had become too dangerous and turn back."

Perhaps just as critically for the future of drone deployment in monitoring volcanoes for eruptions, the BVLOS operational mode – considered a must for UAVs in tracking and responding to a diverse range of urgent situations around the world – was also a resounding success.
“Working with NASA and Black Swift, our scientists believe we can use UASs to help authorities warn communities about the onset of dangerous volcanic eruptions, and many other hazards that now take us by surprise,” said Jonathan Stock, director of the USGS National Innovation Center, adding the response to droughts, floods, and wildfires to that list. “With this tool, we could routinely monitor even remote volcanoes for activity and respond to eruption events – a gamechanger for the safety of both our scientists and the communities around these geologic hazards.”
  10. Earth is far from a solid mass of rock. The outer layer of our planet – known as the lithosphere – is made up of more than 20 tectonic plates; as these gargantuan slabs glide about the face of the planet, we get the movement of continents, and interaction at the boundaries, not least of which is the rise and fall of entire mountain ranges and oceanic trenches.

Yet there's some debate over what causes these giant slabs of rock to move around in the first place. Amongst the many hypotheses put forward over the centuries, convection currents generated by the planet's hot core have been discussed as an explanation, but it's doubtful whether this effect would produce enough energy. A newly published study looks to the skies for an explanation. Noting that force rather than heat is most commonly used to move large objects, the authors suggest that the interplay of gravitational forces from the Sun, Moon, and Earth could be responsible for the movement of Earth's tectonic plates.

Key to the hypothesis is the barycenter – the center of mass of an orbiting system of bodies, in this case that of Earth and the Moon. This is the point around which our Moon actually orbits, and it's not directly at the center of mass of our planet, which we call the geocenter. Instead, the location of the barycenter within Earth changes over the course of the month by as much as 600 kilometers (373 miles), because the Moon's orbit around Earth is elliptical due to our Sun's gravitational pull (a quick back-of-the-envelope check of these distances follows this post).

"Because the oscillating barycenter lies around 4,600 kilometers [2,858 miles] from the geocenter, Earth's tangential orbital acceleration and solar pull are imbalanced except at the barycenter," says geophysicist Anne Hofmeister, from Washington University in St. Louis. "The planet's warm, thick and strong interior layers can withstand these stresses, but its thin, cold, brittle lithosphere responds by fracturing."

Further strain is added as Earth spins on its axis, flattening out slightly from a perfect spherical shape – and these three stresses from the Moon, Sun, and Earth itself combine to cause the shifting and the splitting of tectonic plates. "Differences in the alignment and magnitude of the centrifugal force accompanying the solar pull as Earth undulates in its complex orbit about the Sun superimpose highly asymmetric, temporally variable forces on Earth, which is already stressed by spin," the researchers write.

What's happening underneath the surface is that the solid lithosphere and the solid upper mantle are being spun at different speeds because of these stresses and strains, the researchers report – all due to our particular Earth-Moon-Sun configuration. "Our uniquely large Moon and particular distance from the Sun are essential," says Hofmeister. Without the Moon, and the shifts it causes between the barycenter and the geocenter, we wouldn't see the tectonic plate activity we get on Earth's surface, the researchers argue. Because the Sun's gravitational pull on the Moon is 2.2 times greater than Earth's pull, the Moon will gradually be drawn away from our planet over the next billion years or so.

That said, the gravitational forces at play still need Earth's hot interior for all this to work, the researchers argue. "We propose that plate tectonics result from two different, but interacting, gravitational processes," they write. "We emphasize that Earth's interior heat is essential to creating the thermal and physical boundary layer known as the lithosphere, its basal melt, and the underlying low-velocity zone."
To further validate the hypothesis outlined in their study, the researchers apply their analysis to several rocky planets and moons in the Solar System, none of which have had confirmed tectonic activity to date. Their comparison between Earth and the other major celestial bodies in the Solar System reveals a potential explanation for why we haven't detected tectonic activity on any of the major moons or rocky planets so far. The one closest to Earth in all the necessary parameters, however, is Pluto. "One test would be a detailed examination of the tectonics of Pluto, which is too small and cold to convect, but has a giant moon and a surprisingly young surface," says Hofmeister.
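As a sanity check on the distances quoted in this post, the Earth–Moon barycenter offset follows from the standard two-body relation d = a · M_moon / (M_earth + M_moon). The constants below are textbook values, not numbers taken from the paper.

```python
# Back-of-the-envelope check of the barycenter offset from Earth's center.
M_earth = 5.972e24   # kg
M_moon  = 7.342e22   # kg

def barycenter_offset_km(earth_moon_distance_km):
    """Distance from the geocenter to the Earth-Moon barycenter."""
    return earth_moon_distance_km * M_moon / (M_earth + M_moon)

print(f"mean distance (384,400 km):   {barycenter_offset_km(384_400):.0f} km")  # ~4,670 km
print(f"extreme perigee (356,500 km): {barycenter_offset_km(356_500):.0f} km")  # ~4,330 km
print(f"extreme apogee (406,700 km):  {barycenter_offset_km(406_700):.0f} km")  # ~4,940 km
```

The mean offset of roughly 4,700 km and the perigee-to-apogee swing of about 600 km line up with the ~4,600 km and ~600 km figures quoted in the post.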
  11. New research provides further evidence that rocks representing up to a billion years of geological time were carved away by ancient glaciers during the planet's "Snowball Earth" period, according to a study published in Proceedings of the National Academy of Sciences. The research presents the latest findings in a debate over what caused the Earth's "Great Unconformity"—a time gap in the geological record associated with the erosion of rock up to 3 miles thick in areas across the globe. "The fact that so many places are missing the sedimentary rocks from this time period has been one of the most puzzling features of the rock record," said C. Brenhin Keller, an assistant professor of earth sciences and senior researcher on the study. "With these results, the pattern is starting to make a lot more sense." The massive amount of missing rock that has come to be known as the Great Unconformity was first named in the Grand Canyon in the late 1800s. The conspicuous geological feature is visible where rock layers from distant time periods are sandwiched together, and it is often identified where rocks with fossils sit directly above those that do not contain fossils. "This was a fascinating time in Earth's history," said Kalin McDannell, a postdoctoral researcher at Dartmouth and the lead author of the paper. "The Great Unconformity sets the stage for the Cambrian explosion of life, which has always been puzzling since it is so abrupt in the fossil record—geological and evolutionary processes are usually gradual." For over a century, researchers have sought to explain the cause of the missing geological time. In the last five years, two opposing theories have come into focus: One explains that the rock was carved away by ancient glaciers during the Snowball Earth period about 700 to 635 million years ago. The other focuses on a series of plate tectonic events over a much longer period during the assembly and breakup of the supercontinent Rodinia from about 1 billion to 550 million years ago. Research led by Keller in 2019 first proposed that widespread erosion by continental ice sheets during the Cryogenian glacial interval caused the loss of rock. This was based on geochemical proxies that suggested that large amounts of mass erosion matched with the Snowball Earth period. "The new research verifies and advances the findings in the earlier study," said Keller. "Here we are providing independent evidence of rock cooling and miles of exhumation in the Cryogenian period across a large area of North America." The study relies on a detailed interpretation of thermochronology to make the assessment. Thermochronology allows researchers to estimate the temperature that mineral crystals experience over time as well as their position in the continental crust given a particular thermal structure. Those histories can provide evidence of when missing rock was removed and when rocks currently exposed at the surface may have been exhumed. The researchers used multiple measurements from previously published thermochronometric data taken across four North American locations. The areas, known as cratons, are parts of the continent that are chemically and physically stable, and where plate tectonic activity would not have been common during that time. 
By running simulations that searched for the time-temperature path the rocks experienced, the research recorded a widespread signal of rapid, high magnitude cooling that is consistent with about 2-3 miles of erosion during Snowball Earth glaciations across the interior of North America. "While other studies have used thermochronology to question the glacial origin, a global phenomenon like the Great Unconformity requires a global assessment," said McDannell. "Glaciation is the simplest explanation for erosion across a vast area during the Snowball Earth period since ice sheets were believed to cover most of North America at that time and can be efficient excavators of rock." According to the research team, the competing theory that tectonic activity carved out the missing rock was put forth in 2020 when a separate research group questioned whether ancient glaciers were erosive enough to cause the massive loss of rock. While that research also used thermochronology, it applied an alternate technique at only a single tectonically active location and suggested that the erosion occurred prior to Snowball Earth. "The underlying concept is pretty simple: Something removed a whole lot of rock, resulting in a whole lot of missing time," said Keller. "Our research demonstrates that only glacial erosion could be responsible at this scale." According to the researchers, the new findings also help explain links between the erosion of rock and the emergence of complex organisms about 530 million years ago during the Cambrian explosion. It is believed that erosion during the Snowball Earth period deposited nutrient-rich sediment in the ocean that could have provided a fertile environment for the building blocks of complex life. The study notes that the two hypotheses of how the rock eroded are not mutually exclusive—it is possible that both tectonics and glaciation contributed to global Earth system disruption during the formation of the Great Unconformity. It appears, however, that only glaciation can explain erosion in the center of the continent, far from the tectonic margins. "Ultimately with respect to the Great Unconformity, it may be that the generally accepted reconstruction(s) of more concentrated equatorial packing of the Rodinian continents along with the unique environmental conditions of the Neoproterozoic, proved to be a time of geologic serendipity unlike most any other in Earth history," the research paper says. According to the team, this is the first research that uses their thermochronology modeling approach to study a period that extends well beyond a billion years. In the future, the team will repeat their work on other continents, where they hope to further test these hypotheses about how the Great Unconformity was created and preserved. According to the team, resolving differences in the research is critical to understanding early Earth history and the interconnection of climatic, tectonic and biogeochemical processes. "The fact that there may have been tectonic erosion along the craton margins does not rule out glaciation," said McDannell. "Unconformities are composite features, and our work suggests Cryogenian erosion was a key contributor, but it is possible that both earlier and later erosion were involved in forming the unconformity surface in different places. A global examination will tell us more." 
William Guenthner, from the University of Illinois at Urbana-Champaign; Peter Zeitler from Lehigh University; and David Shuster from the University of California, Berkeley and the Berkeley Geochronology Center served as co-authors of the paper.
  12. It is worth repeating that scientists know more about Mars, Venus, and the dark side of the moon than they know of Earth’s ocean depths. To date, less than 20 percent of the ocean floor has been mapped—13 percent in just the past four years. But scientists would like to map it all by 2030. It’s an essential undertaking, but it’s going to take dedicated effort, public support, and government funding. Such a project can be accomplished only through dedicated global cooperation. The payoff stands to be tremendous—for everything from ship navigation to climate modeling. A clear view of the ocean floor’s topography would allow for optimal siting of undersea cables and offshore wind turbines. It would show where deep-ocean fishing can be done safely and where it cannot. And with a clear three-dimensional understanding of ocean volume, meteorologists could better understand how typhoons and tsunamis travel and intensify as they cross the ocean, bringing storm surges to the shoreline. In addition, climate scientists could better measure the circulation of heat in the ocean and thus build better models of climate change. Climate change is the most basic and urgent reason to map the ocean as quickly as possible. Healthy oceans play an outsize role in minimizing climate change because they capture carbon emissions. But this capacity has limits. Excess carbon acidifies ocean waters, making life difficult for coral reefs and shellfish (oysters, mussels, snails, and clams). It also lowers the oxygen content of the water, impairing the ability of all sea life to breathe. Human practices that disturb the ocean floor—chiefly trawl fishing—make matters worse by releasing carbon from the ocean floor. Deep-sea mining, if it is allowed to go forward unmanaged, would have a similar effect and further disturb undersea ecosystems. To measure the progress of climate change and study the ocean processes and human activities that affect that progress, it is essential to assemble a detailed picture of the undersea world. Too many people are still thinking of the ocean as “out of sight, out of mind” and not relevant if they don’t live near it. This is a luxury we can no longer afford. The ocean floor is still too invisible—even to many people who work on climate change issues! For example, the original 1994 United Nations (UN) Framework Convention on Climate Change—on which all subsequent UN climate change frameworks are based—only recognized three marine and coastal ecosystems: mangroves, sea grasses, and salt marshes. To date, the shallow and deep parts of the ocean floor are still excluded from this framework, even though scientists now know what a huge storehouse of carbon the ocean floor is, despite having mapped only 19.7 percent of it in detail. In the fight against climate change, the UN and global governing bodies must include the ocean floor in global carbon accounting—the process of quantifying the greenhouse gases emitted in the atmosphere—and ocean scientists must continue to learn more about this storehouse for the sake of scientific and policy guidance. To gain this understanding, we must map, and we need all hands on deck. Scientists have the technology to get the job done. Today’s sonars are sensitive enough to map features of ocean water above the seafloor, including wave action, schools of fish, and changes in coral reefs that can indicate marine oxygen levels. 
GIS is now routinely used to analyze data from an array of sources—including sonar, satellites, submersible craft, and underwater cameras—to put together a three-dimensional picture of the underwater world and to study how best to manage and protect it. The Seabed 2030 initiative, an effort fully sanctioned by the UN and supported by the Nippon Foundation to map the ocean floor, has collected bathymetric (depth) data from governments and other data owners. Sensors carried aboard transoceanic cruise ships and cargo ships have gathered more data. And robots have been engaged to survey the ocean floor, similar to the way robots have been used to map the surface of Mars and other planets. To finish the job in due time, though, the initiative will require an extended commitment and further funding. Private sector partners are chipping in, including Vulcan, a philanthropic company founded by the late Paul Allen of Microsoft, and the Schmidt Ocean Institute, launched by Wendy Schmidt and Eric Schmidt of Google. But the amount of work ahead requires the kind of funding that only governments can provide. Larry Mayer, director of the Center for Coastal and Ocean Mapping at the University of New Hampshire, has calculated that $3 billion to $5 billion will be needed to finish this job. It’s a big price tag, but compare it with outlays for space travel and exploration—the National Aeronautics and Space Administration (NASA) is spending nearly $3 billion on the Perseverance Mars rover—and the dollar for dollar value is obvious. All the major seagoing science powers of the world—including the United States, the United Kingdom, France, Germany, and China—must contribute. Climate change—so vividly illustrated by a summer of fire and floods on land and documented by the Intergovernmental Panel on Climate Change’s (IPCC) recent historic report—has greatly increased the urgency to see the entire ocean in detail. The job can be done by the end of this decade if countries step up to the challenge. This, in great part, means embracing the vision of a planetary GIS that’s composed of a coordinated constellation of hubs and geoportals distributed both geographically (across regions and nations) and thematically (e.g., connecting teams that are tackling global mapping projects, such as Seabed 2030, and local organizations that have data and reporting about specific, in-country UN Sustainable Development Goal indicators). For the ocean and coastal user communities, these hubs and geoportals can catalog information items in a well-organized way and connect them to related maps, apps, analytical models, and other relevant data. And all this is built around a series of best practices adopted from a wide range of communities and events, including those that incorporate indigenous knowledge systems. Learn lessons and other resources present the workflows, approaches, and stories needed to bring these best practices to light. The technology is there. The time to use it to map Earth’s oceans is now. source: https://seabed2030.org/
  13. What 3D globe do you want? ArcGIS Desktop has its own dedicated 3D globe app, called ArcGlobe. Or maybe you want to use Google Earth? That is also a globe app. Maybe you need to specify which 3D globe app you want to use.
  14. An underwater volcano off Tonga erupted on Saturday, triggering tsunami warnings and evacuation orders in Japan and causing large waves in several South Pacific islands, where footage on social media showed waves crashing into coastal homes.

Japan's meteorological agency issued tsunami warnings in the early hours on Sunday and said waves as high as three metres (9.84 feet) were expected in the Amami islands in the south. Waves of more than a metre were recorded there earlier. Public broadcaster NHK said no damage or casualties had been reported, interrupting its regular programming to report on the tsunami advisory spanning the entire eastern coast of the Japanese archipelago issued by the country's meteorological agency.

In a briefing, a Japan Meteorological Agency official urged people not to go near the sea until the tsunami advisory and more serious tsunami warnings had been lifted. The warnings - the first in more than five years - covered several specific areas. He said the change in sea levels observed did not follow a familiar pattern of tsunamis following earthquakes. "We do not know yet whether these (waves) are actually tsunami," he told the briefing.

Tsunami waves were observed in Tonga's capital and the capital of American Samoa, a U.S.-based tsunami monitor said, following the eruption at 0410 GMT of the Hunga Tonga-Hunga Ha'apai underwater volcano. The volcano, which lies about 65 km (40 miles) north of Nuku'alofa, caused a 1.2-metre (four-foot) tsunami wave, Australia's Bureau of Meteorology said. The bureau said it continued to monitor the situation but no tsunami threat had been issued to the Australian mainland, islands or territories.

Tsunami waves of 83 cm (2.7 feet) were recorded by gauges in the Tongan capital of Nuku'alofa and two-foot waves were seen at Pago Pago, the capital of American Samoa, the Pacific Tsunami Warning Center said. Fiji issued a tsunami warning, urging residents to avoid the shorelines "due to strong currents and dangerous waves". Jese Tuisinu, a television reporter at Fiji One, posted a video on Twitter showing large waves crashing ashore, with people trying to flee in their cars. "It is literally dark in parts of Tonga and people are rushing to safety following the eruption," he said. There were no immediate reports of casualties.

In New Zealand, the emergency management agency issued an advisory on tsunami activity for the country's north and east coasts, forecasting strong and unusual currents and unpredictable surges along shorelines in those areas. On Friday, the volcano sent ash, steam and gas up to 20 km (12 miles) into the air, Tonga Geological Services said in a Facebook post; the plume had a radius of 260 km (160 miles).

A tsunami advisory was also in effect for the U.S. and Canadian Pacific coast, the Pacific Tsunami Warning Center in Honolulu said. The National Weather Service said tsunami waves along the Oregon and southern Washington coast were expected imminently. High waves were reported in Alaska and Hawaii earlier. In the San Francisco Bay Area of northern California, small parts of the cities of Berkeley and Albany near the bay were ordered to evacuate.

source: Reuters
  15. Mount Semeru, the tallest and most active volcano on the Indonesian island of Java, has routinely spit up small, mostly harmless plumes of ash and gas for years. The circumstances changed on December 4, 2021. Following a partial collapse of the summit lava dome early in December, sensors began to detect elevated seismic activity, according to the Volcanological Survey of Indonesia (PVMBG). After more of Semeru's lava dome gave way, billowing fronts of superheated ash, tephra, soil, and other debris raced down several channels on the mountain's southeastern flank.

Pyroclastic flows are among the most dangerous hazards posed by volcanoes. Sometimes accelerating to speeds of hundreds of kilometers per hour, these masses of volcanic material and landscape debris can be impossible to outrun. They destroy most living things in their path. Though explosive eruptions at the summit were likely small, the pyroclastic flows at Mount Semeru on December 4 were still hot enough that they likely helped propel a billowing "Phoenix cloud" that rose as high as 15 kilometers (9 miles) into the air. Since heavy rains preceded and accompanied the eruption, the pyroclastic flows mixed with large amounts of rainwater and morphed into muddy lahars that rushed down the mountain into populated areas. Lahars are mixtures of water and volcanic debris that behave like rivers of concrete, flattening or burying much of what they encounter.

The damage proxy map above shows areas on the surface that were likely damaged by pyroclastic flows and lahars in December 2021. Dark red pixels represent the most severe damage, while orange and yellow areas are moderately or partially damaged. Each colored pixel represents an area of 30 meters by 30 meters (about the size of a baseball infield). Researchers from The Earth Observatory of Singapore - Remote Sensing Lab (EOS-RS) made the maps by comparing a post-eruption image from December 7, 2021, with a set of pre-eruption images from August 9, 2021, through November 21, 2021 (a minimal sketch of this pre/post comparison idea follows this post).

The slurry of debris that swept down Semeru proved catastrophic to villagers living around the mountain's base in the Lumajang Regency, particularly Curah Kobokan. According to The Jakarta Post, at least 39 people have died. Large numbers of homes were destroyed or damaged, and many animals are among the eruption's victims.

The maps were derived from synthetic aperture radar (SAR) images acquired by the Copernicus Sentinel-1 satellites, operated by the European Space Agency (ESA). The researchers used the Advanced Rapid Imaging and Analysis (ARIA) system, originally developed at NASA's Jet Propulsion Laboratory, California Institute of Technology, and modified at EOS-RS. The ARIA team is supported by NASA's Earth Science Disasters Program.
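The damage proxy maps described above come from comparing pre- and post-eruption Sentinel-1 SAR acquisitions. The ARIA method itself is based on changes in interferometric coherence and is considerably more involved; the sketch below only illustrates the basic pre/post comparison with a simple amplitude log-ratio, and all arrays, shapes, and thresholds are placeholders.

```python
# Illustrative pre/post SAR change detection -- NOT the ARIA algorithm.
# Real inputs would be coregistered Sentinel-1 amplitude images
# (e.g. read with rasterio); random arrays stand in for them here.
import numpy as np

def damage_proxy(pre_stack, post_image, threshold_db=3.0):
    """Flag pixels whose backscatter changed strongly after the event.

    pre_stack:  (n_dates, rows, cols) pre-event amplitude images
    post_image: (rows, cols) post-event amplitude image
    """
    pre_mean = pre_stack.mean(axis=0)        # temporal mean suppresses speckle
    eps = 1e-6                               # avoid division by / log of zero
    log_ratio = 10.0 * np.log10((post_image + eps) / (pre_mean + eps))
    return np.abs(log_ratio) > threshold_db  # large change -> possible damage

# Placeholder data roughly mimicking SAR amplitude statistics
rng = np.random.default_rng(1)
pre = rng.rayleigh(1.0, size=(5, 200, 200))
post = rng.rayleigh(1.0, size=(200, 200))
mask = damage_proxy(pre, post)
print("flagged pixels:", int(mask.sum()))
```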