Lurker

Moderators
  • Content Count

    4,139
  • Joined

  • Last visited

  • Days Won

    368

Lurker last won the day on March 25

Lurker had the most liked content!

Community Reputation

2,326 Celebrity

About Lurker

  • Rank
    Associate Professor
  • Birthday 02/13/1983

Profile Information

  • Gender
    Male
  • Location
    INDONESIA
  • Interests
    GIS and Remote Sensing


  1. A possible design flaw has been reported by DJI Mavic Mini owners, who are experiencing the propellers rubbing against the drone's body due to the lightweight, flexible design of the arms. It appears to be a known problem, with people already working on fixes. Images have appeared on DJI's community website, where a user posted pictures of their Mavic Mini showing what look to be marks caused by the propellers contacting the body. The user shared that they had recently started to notice the front set of propellers tends to hit the drone body, even when new ones were fitted. He also stated it was more likely to happen when the drone is in sport mode. The cause appears to be flexibility in the arms, a consequence of the weight restrictions DJI had to stay within. The arms can flex under the torque of the motors, allowing the propellers to get closer to the drone body and rub against it. It seems to be a known problem for some, as a company has already produced prop mounts that raise the propellers just enough to fully clear the body. There are also designs on Thingiverse that replace the soft foam inserts in the arms with stiff 3D-printed plastic. Both of these options will likely push the Mavic Mini over the 250-gram weight limit. source: https://dronedj.com/2020/03/24/dji-mavic-mini-design-flaw-pops-hit-done-body/
  2. Five new remote-sensing satellites were sent into planned orbit from the Jiuquan Satellite Launch Center in northwest China's Gobi Desert Thursday. The five satellites were launched by a Long March-11 carrier rocket at 2:42 p.m. (Beijing Time). The satellites belong to a commercial remote-sensing satellite constellation project "Zhuhai-1," which will comprise 34 micro-nano satellites, including video, hyperspectral, and high-resolution optical satellites, as well as radar and infrared satellites. The carrier rocket was developed by the China Academy of Launch Vehicle Technology, and the satellites were produced by the Harbin Institute of Technology and operated by the Zhuhai Orbita Aerospace Science and Technology Co. Ltd. Thursday's launch was the 311th mission for the Long March series carrier rockets. The newly launched satellites comprise four hyperspectral satellites with 256 wave-bands and a coverage width of 150 km, and a video satellite with a resolution of 90 centimeters. The Zhuhai-1 hyperspectral satellites have the highest spatial resolution and the largest coverage width of their type in China. The data will be used for precise quantitative analysis of vegetation, water and crops, and will provide services for building smart cities, said Orbita, the largest private operator of hyperspectral satellites in orbit. The company aims to cooperate with government organizations and enterprises to expand the big data satellite services. source: https://www.spacedaily.com/reports/China_launches_new_remote_sensing_satellites_999.html
  3. Sorry, I can't see your picture; our country blocks Imgur, which really sucks 😷
  4. Did you mean that the function area is a spherical function area?
  5. Take a look at this: https://www.cambridge.org/core/what-we-publish/textbooks (untested; you may need to create a free user account first). They have a nice collection of engineering and geosciences books: https://www.cambridge.org/core/what-we-publish/textbooks/listing?aggs[productSubject][filters]=F470FBF5683D93478C7CAE5A30EF9AE8 https://www.cambridge.org/core/what-we-publish/textbooks/listing?aggs[productSubject][filters]=CCC62FE56DCC1D050CA1340C1CCF46F5
  6. The British Geological Survey (BGS) has amassed one of the world’s premier collections of geologic samples. Housed in three enormous warehouses in Nottingham, U.K., it contains about 3 million fossils gathered over more than 150 years at thousands of sites across the country. But this data trove “was not really very useful to anybody,” says Michael Stephenson, a BGS paleontologist. Notes about the samples and their associated rocks “were sitting in boxes on bits of paper.” Now, that could change, thanks to a nascent international effort to meld earth science databases into what Stephenson and other backers are describing as a “geological Google.” This network of earth science databases, called Deep-time Digital Earth (DDE), would be a one-stop link allowing earth scientists to access all the data they need to tackle big questions, such as patterns of biodiversity over geologic time, the distribution of metal deposits, and the workings of Africa’s complex groundwater networks. It’s not the first such effort, but it has a key advantage, says Isabel Montañez, a geochemist at University of California, Davis, who is not involved in the project: funding and infrastructure support from the Chinese government. That backing “will be critical to [DDE’s] success given the scope of the proposed work,” she says. In December 2018, DDE won the backing of the executive committee of the International Union of Geological Sciences, which said ready access to the collected geodata could offer “insights into the distribution and value of earth’s resources and materials, as well as hazards—while also providing a glimpse of the Earth’s geological future.” At a meeting this week in Beijing, 80 scientists from 40 geoscience organizations including BGS and the Russian Geological Research Institute are discussing how to get DDE up and running by the time of the International Geological Congress in New Delhi in March 2020. 
DDE grew out of a Chinese data digitization scheme called the Geobiodiversity Database (GBDB), initiated in 2006 by Chinese paleontologist Fan Junxuan of Nanjing University. China had long-running efforts in earth sciences, but the data were scattered among numerous collections and institutions. Fan, who was then at the Chinese Academy of Sciences’s Nanjing Institute of Geology and Paleontology, organized GBDB around the stacks of geologic strata called sections and the rocks and fossils in each stratum. Norman MacLeod, a paleobiologist at the Natural History Museum in London who is advising DDE, says GBDB has succeeded where similar efforts have stumbled. In the past, he says, volunteer earth scientists tried to do nearly everything themselves, including informatics and data management. GBDB instead pays nonspecialists to input reams of data gleaned from earth science journals covering Chinese findings. Then, paleontologists and stratigraphers review the data for accuracy and consistency, and information technology specialists curate the database and create software to search and analyze the data. Consistent funding also contributed to GBDB’s success, MacLeod says. Although it started small, Fan says GBDB now runs on “several million” yuan per year. Earth scientists outside China began to use GBDB, and it became the official database of the International Commission on Stratigraphy in 2012. BGS decided to partner with GBDB to lift its data “from the page and into cyberspace,” as Stephenson puts it. He and other European and Chinese scientists then began to wonder whether the informatics tools developed for GBDB could help create a broader union of databases. “Our idea is to take these big databases and make them use the same standards and references so a researcher could quickly link them to do big science that hasn’t been done before,” he says. The Beijing meeting aims to finalize an organizational structure for DDE. 
Chinese funding agencies are putting up $75 million over 10 years to get the effort off the ground, Fan says. That level of support sets DDE apart from other cyberinfrastructure efforts “that are smaller in scope and less well funded,” Montañez says. Fan hopes DDE will also attract international support. He envisions nationally supported DDE Centers of Excellence that would develop databases and analytical tools for particular interests. Suzhou, China, has already agreed to host the first of them, which will also house the DDE secretariat. DDE backers say they want to cooperate with other geodatabase programs, such as BGS’s OneGeology project, which seeks to make geologic maps of the world available online. But Mohan Ramamurthy, project director of the U.S. National Science Foundation–funded EarthCube project, sees little scope for collaboration with his effort, which focuses on current issues such as climate change and biosphere-geosphere interactions. “The two programs have very different objectives with little overlap,” he says. Fan also hopes individual institutions will contribute, by sharing data, developing analytical tools, and encouraging their scientists to participate. Once earth scientists are freed of the drudgery of combing scattered collections, he says, they will have time for more important challenges, such as answering “questions about the evolution of life, materials, geography, and climate in deep time.” source: https://www.sciencemag.org/news/2019/02/earth-scientists-plan-meld-massive-databases-geological-google
  7. I don't see your linked references; it seems GIS StackExchange deleted them. I'm not familiar with this tool, so please explain the 6 fragmentation categories in ArcGIS. Do you want to reclassify the GUIDOS raster?
  8. The Gridded Population of the World (GPW) collection, now in its fourth version (GPWv4), models the distribution of human population (counts and densities) on a continuous global raster surface. Since the release of the first version of this global population surface in 1995, the essential inputs to GPW have been population census tables and corresponding geographic boundaries. The purpose of GPW is to provide a spatially disaggregated population layer that is compatible with data sets from social, economic, and Earth science disciplines, and remote sensing. It provides globally consistent and spatially explicit data for use in research, policy-making, and communications. For GPWv4, population input data are collected at the most detailed spatial resolution available from the results of the 2010 round of Population and Housing Censuses, which occurred between 2005 and 2014. The input data are extrapolated to produce population estimates for the years 2000, 2005, 2010, 2015, and 2020. A set of estimates adjusted to national-level, historic and future, population predictions from the United Nations' World Population Prospects report is also produced for the same set of years. The raster data sets are constructed from national or subnational input administrative units to which the estimates have been matched. GPWv4 is gridded with an output resolution of 30 arc-seconds (approximately 1 km at the equator). The nine data sets of the current release are collectively referred to as the Revision 11 (or v4.11) data sets. In this release, several issues identified in the 4.10 release of December 2017 have been corrected as follows: The extent of the final gridded data has been updated to a full global extent. Erroneous no-data pixels in all of the gridded data were recoded as 0 in cases where the census reported known 0 values.
The netCDF files were updated to include the Mean Administrative Unit Area layer, the Land Area and Water Area layers, and two layers indicating the administrative level(s) of the demographic characteristics input data. The National Identifier Grid was reprocessed to remove artefacts from inland water. In addition, two attributes were added to indicate the administrative levels of the demographic characteristics input data, and the data set zip files were corrected to include the National Identifier Polygons shapefile. Two new classes (Total Land Pixels and Ocean Pixels) were added to the Water Mask. The administrative level names of the Greece Administrative Unit Centre Points were translated to English. Separate rasters are available for population counts and population density consistent with national censuses and population registers, or alternative sources in rare cases where no census or register was available. All estimates of population counts and population density have also been nationally adjusted to population totals from the United Nations' World Population Prospects: The 2015 Revision. In addition, rasters are available for basic demographic characteristics (age and sex), data quality indicators, and land and water areas. A vector data set of the centrepoint locations (centroids) for each of the input administrative units and a raster of national-level numeric identifiers are included in the collection to share information about the input data layers. The raster data sets are now available in ASCII (text) format as well as in GeoTIFF format. Five of the eight raster data sets are also available in netCDF format. In addition, the native 30 arc-second resolution data were aggregated to four lower resolutions (2.5 arc-minute, 15 arc-minute, 30 arc-minute, and 1 degree) to enable faster global processing and support of research communities that conduct analyses at these resolutions. All of these resolutions are available in ASCII and GeoTIFF formats.
NetCDF files are available at all resolutions except 30 arc-second. All spatial data sets in the GPWv4 collection are stored in geographic coordinate system (latitude/longitude).
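The aggregation from the native 30 arc-second grid to coarser resolutions can be illustrated with a short NumPy sketch (my own illustration, not CIESIN's actual processing code): population counts are additive, so each coarse cell is simply the sum of the fine cells it covers (for example, a factor of 5 takes 30 arc-seconds to 2.5 arc-minutes). Densities would instead need an area-weighted average.

```python
import numpy as np

def aggregate_counts(counts, factor):
    """Sum non-overlapping factor x factor blocks of a population-count
    raster; counts are additive, so each coarse cell is its block sum."""
    rows, cols = counts.shape
    assert rows % factor == 0 and cols % factor == 0
    return counts.reshape(rows // factor, factor,
                          cols // factor, factor).sum(axis=(1, 3))

# toy 4 x 4 fine grid aggregated by a factor of 2
fine = np.arange(16, dtype=float).reshape(4, 4)
coarse = aggregate_counts(fine, 2)
print(coarse)   # [[10. 18.] [42. 50.]] -- the global total is preserved
```

The reshape trick groups each factor-by-factor block along two new axes, so summing over those axes collapses every block to one coarse pixel without any Python-level loops.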
  9. Done all. Please be more active, or you will be sent back to the inactive members group again, and this time it will be forever.
  10. Really nice. Is it possible to leverage this into a forecast? That would be interesting.
  11. New algorithm solves complex problems more easily and more accurately on a personal computer while requiring less processing power than a supercomputer. The exponential growth in computer processing power seen over the past 60 years may soon come to a halt. Complex systems such as those used in weather forecasting, for example, require high computing capacities, but the costs of running supercomputers to process large quantities of data can become a limiting factor. Researchers at Johannes Gutenberg University Mainz (JGU) in Germany and Università della Svizzera italiana (USI) in Lugano, Switzerland, have recently unveiled an algorithm that can solve complex problems with remarkable facility, even on a personal computer. 
Exponential growth in IT will reach its limit. In the past, we have seen a constant rate of acceleration in information processing power as predicted by Moore's Law, but it now looks as if this exponential rate of growth is limited. New developments rely on artificial intelligence and machine learning, but the related processes are largely not well understood. "Many machine learning methods, such as the very popular deep learning, are very successful, but work like a black box, which means that we don't know exactly what is going on. We wanted to understand how artificial intelligence works and gain a better understanding of the connections involved," said Professor Susanne Gerber, a specialist in bioinformatics at Mainz University. Together with Professor Illia Horenko, a computer expert at Università della Svizzera italiana and a Mercator Fellow of Freie Universität Berlin, she has developed a technique for carrying out incredibly complex calculations at low cost and with high reliability. Gerber and Horenko, along with their co-authors, have summarized their concept in an article entitled "Low-cost scalable discretization, prediction, and feature selection for complex systems" recently published in Science Advances. 
"This method enables us to carry out tasks on a standard PC that previously would have required a supercomputer," emphasized Horenko. In addition to weather forecasts, the researchers see numerous possible applications, such as solving classification problems in bioinformatics, image analysis, and medical diagnostics. 
Breaking down complex systems into individual components. The paper presented is the result of many years of work on the development of this new approach. According to Gerber and Horenko, the process is based on the Lego principle, according to which complex systems are broken down into discrete states or patterns. With only a few patterns or components, i.e., three or four dozen, large volumes of data can be analyzed and their future behavior can be predicted. "For example, using the SPA algorithm we could make a data-based forecast of surface temperatures in Europe for the day ahead and have a prediction error of only 0.75 degrees Celsius," said Gerber. It all works on an ordinary PC and has an error rate that is 40 percent better than the computer systems usually used by weather services, whilst also being much cheaper. SPA, or Scalable Probabilistic Approximation, is a mathematically based concept. The method could be useful in various situations that require large volumes of data to be processed automatically, such as in biology, for example, when a large number of cells need to be classified and grouped. "What is particularly useful about the result is that we can then get an understanding of what characteristics were used to sort the cells," added Gerber. Another potential area of application is neuroscience. Automated analysis of EEG signals could form the basis for assessments of cerebral status. It could even be used in breast cancer diagnosis, as mammography images could be analyzed to predict the results of a possible biopsy. 
"The SPA algorithm can be applied in a number of fields, from the Lorenz model to the molecular dynamics of amino acids in water," concluded Horenko. "The process is easier and cheaper and the results are also better compared to those produced by the current state-of-the-art supercomputers." The collaboration between the groups in Mainz and Lugano was carried out under the aegis of the newly-created Research Center Emergent Algorithmic Intelligence, which was established in April 2019 at JGU and is funded by the Carl Zeiss Foundation. https://www.uni-mainz.de/presse/aktuell/10864_ENG_HTML.php
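The SPA method itself is described in the Science Advances paper, but the core idea it rests on, discretizing a complex system into a handful of recurring patterns and assigning each observation to one of them, can be sketched with a simple k-means-style clustering. This is purely an illustration of the discretization step under my own toy setup, not the authors' algorithm:

```python
import numpy as np

def discretize(data, k, iters=50):
    """Assign each observation to one of k recurring "patterns"
    (cluster centroids); a toy stand-in for pattern discretization,
    not the SPA algorithm itself."""
    # deterministic init: k points spread evenly through the data
    centroids = data[np.linspace(0, len(data) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # distance from every point to every centroid
        d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = data[labels == j].mean(axis=0)
    return labels, centroids

# two well-separated synthetic "states" in 2-D
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
                  rng.normal(5.0, 0.1, (50, 2))])
labels, centroids = discretize(data, k=2)
print(np.bincount(labels))   # [50 50]: each point lands in its pattern
```

Once every observation is reduced to a pattern label, further analysis (e.g., predicting the next state) operates on the small set of patterns rather than the raw high-dimensional data, which is what makes this kind of approach cheap enough for an ordinary PC.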
  12. Johns Hopkins University has put out a heat map in response to the public health emergency that updates the number of confirmed coronavirus cases across the world. According to its website, the map was developed using data from WHO, CDC, China CDC, China National Health Commission and Dingxiangyuan – a website which reportedly aggregates data from Chinese government sources in “near real-time.” As of Friday, more than 900 cases of coronavirus have been confirmed and 26 people have died from the virus, the map shows. The majority of the confirmed cases have been in mainland China with 916 sickened. However, dozens of cases in Illinois are under investigation after the Centers for Disease Control and Prevention confirmed a second case of coronavirus in the United States. links: https://gisanddata.maps.arcgis.com/apps/opsdashboard/index.html#/bda7594740fd40299423467b48e9ecf6
  13. With Huawei basically blocked from using Google services and infrastructure, the firm has taken steps to replace Google Maps on its hardware by signing a partnership with TomTom to provide maps, navigation, and traffic data to Huawei apps. Reuters reports that Huawei is entering this partnership with TomTom as the mapping tech company is based in the Netherlands — therefore side-stepping the bans on working with US firms. TomTom will provide the Chinese smartphone manufacturer with mapping, live traffic data, and software on smartphones and tablets. TomTom spokesman Remco Meerstra confirmed to Reuters that the deal had been closed some time ago but had not been made public by the company. This comes as TomTom unveiled plans to move away from making navigation hardware and will focus more heavily on offering software services — making this a substantial step for TomTom and Huawei. While TomTom doesn’t quite match the global coverage and update speed of Google Maps, having a vital portion of it filled by a dedicated navigation and mapping firm is one step that might appease potential global Huawei smartphone buyers. There is no denying the importance of Google app access outside of China but solid replacements could potentially make a huge difference — even more so if they are recognizable by Western audiences. It’s unclear when we may see TomTom pre-installed on Huawei devices but we are sure that this could be easily added by way of an OTA software update. The bigger question remains if people are prepared to switch from Google Maps to TomTom for daily navigation. resource: https://9to5google.com/2020/01/20/huawei-tomtom/