
All Activity


  1. Last week
  2. Hi, I've been gone for a while. Could you please reactivate my account? Thanks.
  3. Really nice! Is it possible to leverage this for forecasting? That would be interesting.
  4. Saw similar news last month: Using Machine Learning to “Nowcast” Precipitation in High Resolution by Google. The results seemed pretty good. Here is a visualization of predictions made over the course of roughly one day. Left: the 1-hour HRRR prediction made at the top of each hour, the limit to how often HRRR provides predictions. Center: the ground truth, i.e., what we are trying to predict. Right: the predictions made by our model, every 2 minutes (displayed here every 15 minutes) at roughly 10 times the spatial resolution of HRRR. Notice that the model captures the general motion and general shape of the storm. The two methods seem similar.
  5. New algorithm solves complex problems more easily and more accurately on a personal computer, while requiring less processing power than a supercomputer.

The exponential growth in computer processing power seen over the past 60 years may soon come to a halt. Complex systems such as those used in weather forecasting require high computing capacities, but the cost of running supercomputers to process large quantities of data can become a limiting factor. Researchers at Johannes Gutenberg University Mainz (JGU) in Germany and Università della Svizzera italiana (USI) in Lugano, Switzerland, have recently unveiled an algorithm that can solve complex problems with remarkable facility, even on a personal computer.

In the past, we have seen a constant acceleration of information processing power, as predicted by Moore's Law, but this exponential rate of growth now appears to be reaching its limit. New developments rely on artificial intelligence and machine learning, but the related processes are largely not well understood. "Many machine learning methods, such as the very popular deep learning, are very successful, but work like a black box, which means that we don't know exactly what is going on. We wanted to understand how artificial intelligence works and gain a better understanding of the connections involved," said Professor Susanne Gerber, a specialist in bioinformatics at Mainz University. Together with Professor Illia Horenko, a computer expert at Università della Svizzera italiana and a Mercator Fellow of Freie Universität Berlin, she has developed a technique for carrying out incredibly complex calculations at low cost and with high reliability. Gerber and Horenko, along with their co-authors, have summarized their concept in an article entitled "Low-cost scalable discretization, prediction, and feature selection for complex systems," recently published in Science Advances.
"This method enables us to carry out tasks on a standard PC that previously would have required a supercomputer," emphasized Horenko. In addition to weather forecasts, the researchers see numerous possible applications, such as solving classification problems in bioinformatics, image analysis, and medical diagnostics.

The paper is the result of many years of work on this new approach. According to Gerber and Horenko, the process is based on the Lego principle: complex systems are broken down into discrete states or patterns. With only a few patterns or components, i.e., three or four dozen, large volumes of data can be analyzed and their future behavior predicted. "For example, using the SPA algorithm we could make a data-based forecast of surface temperatures in Europe for the day ahead with a prediction error of only 0.75 degrees Celsius," said Gerber. It all works on an ordinary PC, with an error rate 40 percent lower than that of the computer systems usually used by weather services, while also being much cheaper.

SPA, or Scalable Probabilistic Approximation, is a mathematically based concept. The method could be useful in various situations that require large volumes of data to be processed automatically, such as in biology, when a large number of cells need to be classified and grouped. "What is particularly useful about the result is that we can then get an understanding of what characteristics were used to sort the cells," added Gerber. Another potential area of application is neuroscience, where automated analysis of EEG signals could form the basis for assessments of cerebral status. It could even be used in breast cancer diagnosis, as mammography images could be analyzed to predict the results of a possible biopsy.
"The SPA algorithm can be applied in a number of fields, from the Lorenz model to the molecular dynamics of amino acids in water," concluded Horenko. "The process is easier and cheaper and the results are also better compared to those produced by the current state-of-the-art supercomputers." The collaboration between the groups in Mainz and Lugano was carried out under the aegis of the newly-created Research Center Emergent Algorithmic Intelligence, which was established in April 2019 at JGU and is funded by the Carl Zeiss Foundation. https://www.uni-mainz.de/presse/aktuell/10864_ENG_HTML.php
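The "Lego principle" the article describes (discretize a complex system into a handful of states, then predict from the transitions between them) can be illustrated with a toy sketch. To be clear, this is not the SPA algorithm itself: the fixed temperature bins below are a hypothetical stand-in for the patterns SPA would learn from data.

```python
# Toy sketch of state discretization + transition-based prediction.
# NOT the actual SPA algorithm; the bins are hand-picked for illustration.
from collections import Counter, defaultdict

def discretize(series, boundaries):
    """Map each value to a discrete state index using fixed bin boundaries."""
    return [sum(1 for b in boundaries if x >= b) for x in series]

def transition_counts(states):
    """Count observed transitions state -> next state."""
    counts = defaultdict(Counter)
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    return counts

def predict_next(states, counts):
    """Predict the most frequent successor of the last observed state."""
    last = states[-1]
    if not counts[last]:
        return last  # never seen a transition out of this state
    return counts[last].most_common(1)[0][0]

# Made-up daily temperatures; states: 0 = cold, 1 = mild, 2 = warm
temps = [2.1, 3.0, 5.2, 8.1, 5.5, 3.2, 5.1, 8.0, 5.2]
states = discretize(temps, boundaries=[4.0, 7.0])
counts = transition_counts(states)
print(predict_next(states, counts))  # prints 2: "mild" was followed by "warm" most often
```

The real method learns both the patterns and the transition statistics from the data, which is where the claimed cost savings come from; this sketch only shows why a few discrete states can be enough to make a usable forecast.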
  6. Earlier
  7. Hi, Please make my account active again. I can assure you that will be more regular and active on the forum henceforth. Thank you
  8. Hi everybody, I have found some old maps in CartoMap .cgf format. It is a multi-file GIS format like the shapefile (there is a .cgf file for the vector data, a .dbf for the attributes, etc.). Has anybody worked with that format and/or with the CartoMap application? I have found some dead websites where I read about the freeware editor (CartoMap Pro), but in 2020 all the download links are dead. There is a website (https://cartovcl.com/download-files/) where you can download Delphi-based source code, etc., but no .exe installer. My goals: to get that application, and to convert the old files to shp/dxf/whatever. Anyone have any idea? :) cheers
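Not an answer on the .cgf geometry itself, but since the attributes sit in a standard dBase .dbf file, those at least can be recovered with plain Python. A minimal sketch, assuming the file follows the usual dBase III header layout (for real work, use a full dBase reader such as GDAL/OGR):

```python
# Minimal dBase (.dbf) header inspector; enough to see what attributes
# an old CartoMap table contains before deciding how to convert it.
import struct

def read_dbf_header(data):
    """Return (record count, header size, record size): a little-endian
    uint32 at offset 4 followed by two uint16s."""
    return struct.unpack("<IHH", data[4:12])

def read_field_names(data):
    """Field descriptors are 32 bytes each, starting at offset 32 and
    terminated by a 0x0D byte; the name is the first 11 bytes, NUL-padded."""
    names, offset = [], 32
    while data[offset] != 0x0D:
        raw = data[offset:offset + 11]
        names.append(raw.split(b"\x00")[0].decode("ascii"))
        offset += 32
    return names

# Usage (hypothetical filename):
#   data = open("oldmap.dbf", "rb").read()
#   print(read_dbf_header(data), read_field_names(data))
```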
  9. Johns Hopkins University has put out a heat map in response to the public health emergency that updates the number of confirmed coronavirus cases across the world. According to its website, the map was developed using data from WHO, CDC, China CDC, China National Health Commission and Dingxiangyuan – a website which reportedly aggregates data from Chinese government sources in “near real-time.” As of Friday, more than 900 cases of coronavirus have been confirmed and 26 people have died from the virus, the map shows. The majority of the confirmed cases have been in mainland China with 916 sickened. However, dozens of cases in Illinois are under investigation after the Centers for Disease Control and Prevention confirmed a second case of coronavirus in the United States. links: https://gisanddata.maps.arcgis.com/apps/opsdashboard/index.html#/bda7594740fd40299423467b48e9ecf6
  10. Google announced that Dataset Search, a service that lets you search close to 25 million publicly available data sets, is now out of beta. Dataset Search first launched in September 2018. Researchers can use these data sets, which range from small ones that tell you how many cats there were in the Netherlands from 2010 to 2018 to large annotated audio and image sets, to check their hypotheses or to train and test their machine learning models. The tool currently indexes about 6 million tables. With this release, Dataset Search is getting a mobile version, and Google is adding a few new features. The first is a filter that lets you choose which type of data set you want to see (tables, images, text, etc.), which makes it easier to find the data you're looking for. In addition, the company has added more information about the data sets and the organizations that publish them. I searched 'remote sensing' and found a lot of geographic information. Much of the data in the search index comes from government agencies; in total, Google says, there are about 2 million U.S. government data sets in the index right now. But you'll also regularly see Google's own Kaggle show up, as well as a number of other public and private organizations that make data available. As Google notes, anybody who owns an interesting data set can have it indexed by using standard schema.org markup to describe the data in more detail. Source
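For anyone curious what that schema.org markup looks like: below is a minimal, hypothetical example of the Dataset description a publisher would embed in a page as JSON-LD. All names and URLs are made up; the field names (`Dataset`, `DataDownload`, `distribution`, etc.) are standard schema.org vocabulary.

```python
# Build a minimal schema.org "Dataset" description and serialize it as
# JSON-LD; the values here are invented examples, not a real data set.
import json

dataset = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "Example remote sensing scenes",  # hypothetical
    "description": "Sample multispectral scenes for testing.",
    "keywords": ["remote sensing", "multispectral"],
    "distribution": [{
        "@type": "DataDownload",
        "encodingFormat": "GeoTIFF",
        "contentUrl": "https://example.com/scenes.zip",  # hypothetical
    }],
}

# This string would go inside a <script type="application/ld+json"> tag
# on the page describing the data set, where the crawler can find it.
markup = json.dumps(dataset, indent=2)
print(markup)
```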
  11. With Huawei basically blocked from using Google services and infrastructure, the firm has taken steps to replace Google Maps on its hardware by signing a partnership with TomTom to provide maps, navigation, and traffic data to Huawei apps. Reuters reports that Huawei is entering this partnership with TomTom as the mapping tech company is based in the Netherlands — therefore side-stepping the bans on working with US firms. TomTom will provide the Chinese smartphone manufacturer with mapping, live traffic data, and software on smartphones and tablets. TomTom spokesman Remco Meerstra confirmed to Reuters that the deal had been closed some time ago but had not been made public by the company. This comes as TomTom unveiled plans to move away from making navigation hardware and will focus more heavily on offering software services — making this a substantial step for TomTom and Huawei. While TomTom doesn’t quite match the global coverage and update speed of Google Maps, having a vital portion of it filled by a dedicated navigation and mapping firm is one step that might appease potential global Huawei smartphone buyers. There is no denying the importance of Google app access outside of China but solid replacements could potentially make a huge difference — even more so if they are recognizable by Western audiences. It’s unclear when we may see TomTom pre-installed on Huawei devices but we are sure that this could be easily added by way of an OTA software update. The bigger question remains if people are prepared to switch from Google Maps to TomTom for daily navigation. resource: https://9to5google.com/2020/01/20/huawei-tomtom/
  12. Update: here is what I have done so far: 1. disabled hardware acceleration; 2. turned off all compatibility settings except Run This Program As Administrator (point 4 in the article above); 3. stopped the Windows Presentation Foundation Font Cache service (point 5 in the article above). I saw a small increase in performance, and the workspace is almost usable now, but I still get lag when scrolling.
  13. I have a project with AutoCAD files, so I fired up my workstation laptop (Dell Precision 5510) and loaded the CAD data. Holy cr*p, this software runs like a snail 🤣. I tried disabling hardware acceleration; yeah, much better experience, but still as laggy as the old ArcGIS Pro beta 😂. Searching around, I found this article: https://knowledge.autodesk.com/support/autocad/troubleshooting/caas/sfdcarticles/sfdcarticles/Optimize-Performance-within-Windows-7-Environments.html?_ga=2.205082898.303799305.1579712200-1066991414.1579712200 I didn't have time to try all the suggestions yet, but hey, all GISArea members, do you use AutoCAD? How do you improve your CAD experience? Share with me 😉
  14. 3dbu
  15. Thanks for the heads-up, Lurker 🙏
  16. Thank you. By changing the format I solved the problem, at least initially.
  17. Hello, I was forced to convert from HDF format to TIFF; however, it was very time-consuming. My raster in HDF format has 4 bands, but in TIFF format it has only 1 band. I don't think converting to TIFF is the best solution, but it works for now.
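The usual reason only one band comes out is that GDAL exposes each HDF band as a separate "subdataset", so a plain export grabs just one of them. Assuming GDAL's command-line tools are available, a common fix is to stack the subdatasets into a VRT and translate that to a single multiband GeoTIFF. The sketch below only builds the commands; the subdataset names are hypothetical, so list the real ones first with `gdalinfo input.hdf`.

```python
# Build the gdalbuildvrt/gdal_translate commands that stack HDF
# subdatasets into one multiband GeoTIFF. Subdataset names below are
# hypothetical; copy the real ones from `gdalinfo` output.

def hdf_to_multiband_commands(subdatasets, out_tif, vrt="stack.vrt"):
    """Return the two shell commands, as argument lists:
    1) stack each subdataset as its own band in a VRT,
    2) translate the VRT to a multiband GeoTIFF."""
    stack = ["gdalbuildvrt", "-separate", vrt] + list(subdatasets)
    translate = ["gdal_translate", vrt, out_tif]
    return [stack, translate]

bands = [f'HDF4_EOS:EOS_GRID:"input.hdf":Grid:band{i}' for i in range(1, 5)]
for cmd in hdf_to_multiband_commands(bands, "output.tif"):
    print(" ".join(cmd))
```

Running the two printed commands in a shell (with GDAL installed) should produce a 4-band `output.tif` rather than four separate single-band files.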
  18. January 3, 2020 - Recent Landsat 8 Safehold Update

On December 19, 2019 at approximately 12:23 UTC, Landsat 8 experienced a spacecraft constraint which triggered entry into a Safehold. The Landsat 8 Flight Operations Team recovered the satellite from the event on December 20, 2019 (DOY 354). The spacecraft resumed nominal on-orbit operations and ground station processing on December 22, 2019 (DOY 356).

Data acquired between December 22, 2019 (DOY 356) and December 31, 2019 (DOY 365) exhibit some increased radiometric striping and minor geometric distortions (see image below) in addition to the normal Operational Land Imager/Thermal Infrared Sensor (OLI/TIRS) alignment offset apparent in Real-Time tier data. Acquisitions after December 31, 2019 (DOY 365) are consistent with pre-Safehold Real-Time tier data and are suitable for remote sensing use where applicable. All acquisitions after December 22, 2019 (DOY 356) will be reprocessed to meet typical Landsat data quality standards after the next TIRS Scene Select Mirror (SSM) calibration event, scheduled for January 11, 2020.

[Image caption: Landsat 8 Operational Land Imager acquisition on December 22, 2019 (path 148/row 044) after the spacecraft resumed nominal on-orbit operations and ground station processing. This acquisition demonstrates the increased radiometric striping and minor geometric distortions observed in all data acquired between December 22, 2019 and December 31, 2019.]

Data not acquired during the Safehold event are listed below and displayed in purple on the map.
Map displaying Landsat 8 scenes not acquired from Dec 19-22, 2019:
- Path 207, Rows 160-161
- Path 223, Rows 60-178
- Path 6, Rows 22-122
- Path 22, Rows 18-122
- Path 38, Rows 18-122
- Path 54, Rows 18-214
- Path 70, Rows 18-120
- Path 86, Rows 24-110
- Path 102, Rows 19-122
- Path 118, Rows 18-185
- Path 134, Rows 18-133
- Path 150, Rows 18-133
- Path 166, Rows 18-222
- Path 182, Rows 18-131
- Path 198, Rows 18-122
- Path 214, Rows 34-122
- Path 230, Rows 54-179
- Path 13, Rows 18-122
- Path 29, Rows 20-232
- Path 45, Rows 18-133

After successfully recovering from the Safehold, data acquired on December 20, 2019 (DOY 354) and during most of December 21, 2019 (DOY 355) were ingested into the USGS Landsat Archive and marked as "Engineering". These data are still being assessed to determine whether they will be made available for download through all USGS Landsat data portals. source: https://www.usgs.gov/land-resources/nli/landsat/january-3-2020-recent-landsat-8-safehold-update
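Those ranges are easy to mis-scan by eye, so here is a small helper (my own, not from USGS) that checks whether a given WRS-2 path/row fell inside the listed acquisition gap:

```python
# Check whether a WRS-2 path/row was in the Landsat 8 Safehold gap
# (Dec 19-22, 2019), using the ranges from the USGS notice above.
MISSED = {  # path -> (first row, last row)
    207: (160, 161), 223: (60, 178), 6: (22, 122), 22: (18, 122),
    38: (18, 122), 54: (18, 214), 70: (18, 120), 86: (24, 110),
    102: (19, 122), 118: (18, 185), 134: (18, 133), 150: (18, 133),
    166: (18, 222), 182: (18, 131), 198: (18, 122), 214: (34, 122),
    230: (54, 179), 13: (18, 122), 29: (20, 232), 45: (18, 133),
}

def was_missed(path, row):
    """True if the scene at (path, row) was not acquired during the gap."""
    lo, hi = MISSED.get(path, (None, None))
    return lo is not None and lo <= row <= hi

print(was_missed(148, 44))   # False: the post-recovery example scene
print(was_missed(166, 100))  # True: inside a missed range
```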
  19. Please make my account active again. I promise I will access this forum regularly. Thank you for your kindness.
  20. Interesting how-to videos: WebODM is a friendly graphical user interface (GUI) for OpenDroneMap. It enhances the capabilities of OpenDroneMap by providing an easy tool for processing drone imagery, with buttons, processing status bars, and a new way to store images. WebODM lets you work in projects, so the user can create different projects and process the related images. As a whole, WebODM on Windows is an implementation of PostgreSQL, Node, Django, and OpenDroneMap on Docker. The installation requires 6 GB of disk space plus Docker. That seems huge, but it is the only way to process drone imagery on Windows using only open-source software. We definitely see huge potential in WebODM for image processing, so we have made this tutorial on the installation and will post more tutorials on applying WebODM to drone images. For this tutorial you need Docker Toolbox installed on your computer. You can follow this tutorial to get Docker on your PC: https://www.hatarilabs.com/ih-en/tutorial-installing-docker You can visit the WebODM site on GitHub: https://github.com/OpenDroneMap/WebODM Videos: the tutorial was split into three short videos. Part 1: https://www.youtube.com/watch?v=AsMSoWAToxE Part 2: https://www.youtube.com/watch?v=8GKx3fz0qgE Part 3: https://www.youtube.com/watch?v=eCZFzaXyMmA
