Lurker

Moderators
  • Content Count

    3,991
  • Joined

  • Last visited

  • Days Won

    279

Lurker last won the day on June 24

Lurker had the most liked content!

Community Reputation

2,082 Celebrity

About Lurker

  • Rank
    Associate Professor
  • Birthday 02/13/1983

Profile Information

  • Gender
    Male
  • Location
    INDONESIA
  • Interests
    GIS and Remote Sensing

  1. from their official sites : NOAA’s flagship weather model — the Global Forecast System (GFS) — is undergoing a significant upgrade today to include a new dynamical core called the Finite-Volume Cubed-Sphere (FV3). This upgrade will drive global numerical weather prediction into the future with improved forecasts of severe weather, winter storms, and tropical cyclone intensity and track. NOAA research scientists originally developed the FV3 as a tool to predict long-range weather patterns at time frames ranging from multiple decades to interannual, seasonal and subseasonal. In recent years, creators of the FV3 at NOAA’s Geophysical Fluid Dynamics Laboratory expanded it to also become the engine for NOAA’s next-generation operational GFS. “In the past few years, NOAA has made several significant technological leaps into the future – from new satellites in orbit to this latest weather model upgrade,” said Secretary of Commerce Wilbur Ross. “Through the use of this advanced model, the dedicated scientists, forecasters, and staff at NOAA will remain ever-alert for any threat to American lives and property.” The FV3-based GFS brings together the superior dynamics of global climate modeling with day-to-day reliability and speed of operational numerical weather prediction. Additional enhancements to the science that produce rain and snow in the GFS also contribute to the improved forecasting capability of this upgrade. “The significant enhancements to the GFS, along with creating NOAA’s new Earth Prediction Innovation Center, are positioning the U.S. to reclaim international leadership in the global earth-system modeling community,” said Neil Jacobs, Ph.D., acting NOAA administrator. The GFS upgrade underwent rigorous testing led by NOAA’s National Centers for Environmental Prediction (NCEP) Environmental Modeling Center and NCEP Central Operations that included more than 100 scientists, modelers, programmers and technicians from around the country. 
With real-time evaluations for a year alongside the previous version of the GFS, NOAA carefully documented the strengths of each. When tested against historic weather dating back an additional three years, the upgraded FV3-based GFS performed better across a wide range of weather phenomena. The scientific and performance evaluation shows that the upgraded FV3-based GFS provides results equal to or better than the current global model in many measures. This upgrade establishes the foundation for further advancements in the future as we improve observation quality control, data assimilation, and the model physics. “We are excited about the advancements enabled by the new GFS dynamical core and its prospects for the future,” said Louis W. Uccellini, Ph.D., director, NOAA’s National Weather Service. “Switching out the dynamical core will have a significant impact on our ability to make more accurate 1-2 day forecasts and increase the level of accuracy for our 3-7 day forecasts. However, our job doesn't end there — we also have to improve the physics as well as the data assimilation system used to ingest data and initialize the model.” Uccellini explained that NOAA’s work with the National Center for Atmospheric Research to build a common infrastructure between the operational and research communities will help advance the FV3-based GFS beyond changing the core. “This new dynamical core and our work with NCAR will accelerate the transition of research advances into operations to produce even more accurate forecasts in the future,” added Uccellini. Operating a new and sophisticated weather model requires robust computing capacity. In January 2018, NOAA augmented its weather and climate supercomputing systems to increase performance by nearly 50 percent and added 60 percent more storage capacity to collect and process weather, water and climate observations. This increased capacity enabled the parallel testing of the FV3-based GFS throughout the year. 
The retiring version of the model will no longer be used in operations but will continue to run in parallel through September 2019 to provide model users with data access and additional time to compare performance. source: https://www.noaa.gov/media-release/noaa-upgrades-us-global-weather-forecast-model
  2. Meteorology revolves as much around good weather models as it does good weather data, and the core US model is about to receive a long-overdue refresh. NOAA has upgraded its Global Forecast System with a long-in-testing dynamical core, the Finite-Volume Cubed-Sphere (aka FV3). It's the first time the model has been replaced in roughly 40 years, and it promises higher-resolution forecasts, lower computational overhead and more realistic water vapor physics. The results promise to be tangible. NOAA believes there will be a "significant impact" on one- and two-day forecasts, and improved overall accuracy for forecasts up to a week ahead. It also hopes for further improvements to both the physics as well as the system that ingests data and invokes the weather model. This is on top of previous upgrades to NOAA supercomputers that should provide more capacity. The old model will still hang around through September, although not as a backup -- it's strictly there for data access and performance comparisons. FV3 had been chosen years ago to replace the old core, and it's been in parallel testing for over a year. Not everyone is completely satisfied with the new model. Ars Technica pointed out that the weather community is concerned about surface temperatures that have skewed low, for instance. It should be more accurate overall, though, and that could be crucial for tracking hurricanes, blizzards and other serious weather patterns that can evolve by the hour. source: https://www.engadget.com/2019/06/12/us-weather-forecast-model-update
  3. multifunction casing. you can run 3D games and grate cheese for your hamburger. excellent thought, Apple, as always LOL
  4. We are all already familiar with GPS navigation outdoors and what wonders it does not only for our everyday life, but also for business operations. Outdoor maps, allowing for navigation via car or by foot, have long helped mankind to find even the most remote and hidden places. Increased levels of efficiency, unprecedented levels of control over operational processes, route planning, monitoring of deliveries, safety and security regulations and much more have been made possible. Some places are, however, harder to reach and navigate than others. For instance, places like big indoor areas – universities, hospitals, airports, convention centers or factories, among others. Luckily, that struggle is about to become a thing of the past. So what’s the solution for navigating through and managing complex indoor buildings?

Indoor Mapping and Visualization with ArcGIS Indoors

The answer is simple – indoor mapping. Indoor mapping is a revolutionary concept that visualizes an indoor venue and spatial data on a digital 2D or 3D map. Showing places, people and assets on a digital map enables solutions such as indoor positioning and navigation. These, in turn, allow for many different use cases that help companies optimize their workflows and efficiencies.

Mobile Navigation and Data

The idea behind this solution is the same as outdoor navigation, only instead it allows you to see routes and locate objects and people in a closed environment. As GPS signals are not available indoors, different technology solutions based on either iBeacons, WiFi or lighting are used to create indoor maps and enable positioning services. You can plan a route indoors from point A to point B with customized pins and remarks, analyze whether facilities are being used to their full potential, discover new business opportunities, evaluate user behaviors and send them real-time targeted messages based on their location, intelligently park vehicles, and the list goes on! 
With the help of geolocation, indoor mapping stores and provides versatile real-time data on everything that is happening indoors, including placements and conditions of assets and human movements. This allows for a common operating picture, where all stakeholders share the same level of information and insights into internal processes. Having a centralized mapping system enables effortless navigation through all the assets and keeps facility managers updated on the latest changes, which ultimately improves business efficiency. Just think how many operational insights can be received through visualizations of assets on your customized map – you can monitor and analyze the whole infrastructure and optimize the performance accordingly. How to engage your users/visitors at the right time and place? What does it take to improve security management? Are the workflow processes moving seamlessly? Answers to those and many other questions can be found in an indoor mapping solution. Interactive indoor experiences are no longer a thing of the future, they are here and now. source: https://www.esri.com/arcgis-blog/products/arcgis-indoors/mapping/what-is-indoor-mapping/
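As a toy illustration of the positioning idea behind these indoor systems (the geometry only, not any vendor's API — the beacon layout, coordinates and ranges below are invented for the sketch, and real iBeacon/WiFi systems must also filter noisy range estimates), a 2D position can be recovered from distances to three fixed beacons by least-squares trilateration:

```python
import numpy as np

def trilaterate(beacons, distances):
    """Estimate a 2D position from known beacon coordinates and measured ranges.

    Linearizes the circle equations by subtracting the first one from the rest,
    then solves the resulting linear system in a least-squares sense.
    """
    beacons = np.asarray(beacons, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtracting circle 0 from circle i gives, for each i >= 1:
    #   2(xi - x0)x + 2(yi - y0)y = d0^2 - di^2 + (xi^2 + yi^2) - (x0^2 + y0^2)
    A = 2 * (beacons[1:] - beacons[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(beacons[1:] ** 2, axis=1) - np.sum(beacons[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three hypothetical beacons at known positions (metres) and ranges to a device:
beacons = [(0, 0), (10, 0), (0, 10)]
true_pos = np.array([3.0, 4.0])
distances = [np.hypot(*(true_pos - b)) for b in beacons]
print(trilaterate(beacons, distances))  # ≈ [3. 4.]
```

With exact ranges the three circles intersect in one point; with noisy ranges the least-squares solve returns the best-fitting position, which is why the linearized form is convenient in practice.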
  5. found this interesting tutorial: For the last couple years I have been testing out the ever-improving support for parallel query processing in PostgreSQL, particularly in conjunction with the PostGIS spatial extension. Spatial queries tend to be CPU-bound, so applying parallel processing is frequently a big win for us. Initially, the results were pretty bad. With PostgreSQL 10, it was possible to force some parallel queries by jimmying with global cost parameters, but nothing would execute in parallel out of the box. With PostgreSQL 11, we got support for parallel aggregates, and those tended to parallelize in PostGIS right out of the box. However, parallel scans still required some manual alterations to PostGIS function costs, and parallel joins were basically impossible to force no matter what knobs you turned. With PostgreSQL 12 and PostGIS 3, all that has changed. All standard query types now readily parallelize using our default costings. That means parallel execution of: parallel sequence scans, parallel aggregates, and parallel joins! TL;DR: PostgreSQL 12 and PostGIS 3 have finally cracked the parallel spatial query execution problem, and all major queries execute in parallel without extraordinary interventions.

What Changed

With PostgreSQL 11, most parallelization worked, but only at much higher function costs than we could apply to PostGIS functions. With higher PostGIS function costs, other parts of PostGIS stopped working, so we were stuck in a Catch-22: improve costing and break common queries, or leave things working with non-parallel behaviour. For PostgreSQL 12, the core team (in particular Tom Lane) provided us with a sophisticated new way to add spatial index functionality to our key functions. With that improvement in place, we were able to globally increase our function costs without breaking existing queries. That in turn has signalled the parallel query planning algorithms in PostgreSQL to parallelize spatial queries more aggressively. 
Setup

In order to run these tests yourself, you will need PostgreSQL 12 and PostGIS 3.0. You’ll also need a multi-core computer to see actual performance changes. I used a 4-core desktop for my tests, so I could expect 4x improvements at best. The setup instructions show where to download the Canadian polling division data used for the testing:
  • pd – a table of ~70K polygons
  • pts – a table of ~70K points
  • pts_10 – a table of ~700K points
  • pts_100 – a table of ~7M points

We will work with the default configuration parameters and just mess with max_parallel_workers_per_gather at run-time to turn parallelism on and off for comparison purposes. When max_parallel_workers_per_gather is set to 0, parallel plans are not an option. max_parallel_workers_per_gather sets the maximum number of workers that can be started by a single Gather or Gather Merge node. Setting this value to 0 disables parallel query execution. Default 2. Before running tests, make sure you have a handle on what your parameters are set to: I frequently found I accidentally tested with max_parallel_workers set to 1, which will result in two processes working: the leader process (which does real work when it is not coordinating) and one worker.

show max_worker_processes;
show max_parallel_workers;
show max_parallel_workers_per_gather;

Aggregates

Behaviour for aggregate queries is still good, as seen in PostgreSQL 11 last year.

SET max_parallel_workers = 8;
SET max_parallel_workers_per_gather = 4;
EXPLAIN ANALYZE SELECT Sum(ST_Area(geom)) FROM pd;

Boom! We get a 3-worker parallel plan and execution about 3x faster than the sequential plan.

Scans

The simplest spatial parallel scan adds a spatial function to the target list or filter clause.

SET max_parallel_workers = 8;
SET max_parallel_workers_per_gather = 4;
EXPLAIN ANALYZE SELECT ST_Area(geom) FROM pd;

Boom! We get a 3-worker parallel plan and execution about 3x faster than the sequential plan. This query did not work out-of-the-box with PostgreSQL 11. 
Gather (cost=1000.00..27361.20 rows=69534 width=8)
  Workers Planned: 3
  ->  Parallel Seq Scan on pd (cost=0.00..19407.80 rows=22430 width=8)

Joins

Starting with a simple join of all the polygons to the 100 points-per-polygon table, we get:

SET max_parallel_workers_per_gather = 4;
EXPLAIN SELECT * FROM pd JOIN pts_100 pts ON ST_Intersects(pd.geom, pts.geom);

Right out of the box, we get a parallel plan! No amount of begging and pleading would get a parallel plan in PostgreSQL 11.

Gather (cost=1000.28..837378459.28 rows=5322553884 width=2579)
  Workers Planned: 4
  ->  Nested Loop (cost=0.28..305122070.88 rows=1330638471 width=2579)
        ->  Parallel Seq Scan on pts_100 pts (cost=0.00..75328.50 rows=1738350 width=40)
        ->  Index Scan using pd_geom_idx on pd (cost=0.28..175.41 rows=7 width=2539)
              Index Cond: (geom && pts.geom)
              Filter: st_intersects(geom, pts.geom)

The only quirk in this plan is that the nested loop join is being driven by the pts_100 table, which has 10 times the number of records as the pd table. The plan for a query against the pts_10 table also returns a parallel plan, but with pd as the driving table.

EXPLAIN SELECT * FROM pd JOIN pts_10 pts ON ST_Intersects(pd.geom, pts.geom);

Right out of the box, we still get a parallel plan! No amount of begging and pleading would get a parallel plan in PostgreSQL 11.

Gather (cost=1000.28..85251180.90 rows=459202963 width=2579)
  Workers Planned: 3
  ->  Nested Loop (cost=0.29..39329884.60 rows=148129988 width=2579)
        ->  Parallel Seq Scan on pd (cost=0.00..13800.30 rows=22430 width=2539)
        ->  Index Scan using pts_10_gix on pts_10 pts (cost=0.29..1752.13 rows=70 width=40)
              Index Cond: (geom && pd.geom)
              Filter: st_intersects(pd.geom, geom)

source: http://blog.cleverelephant.ca/2019/05/parallel-postgis-4.html
  6. welcome and enjoy your stay here
  7. hello, related to this bug: SAVE THE DATE: GPS WEEK NUMBER ROLLOVER EVENT – APRIL 6TH 2019. IF YOUR NETWORKS OR SYSTEMS USE COORDINATED UNIVERSAL TIME (UTC), APRIL 6TH 2019 IS PROBABLY A DATE WORTH MARKING IN ADVANCE. On, or possibly after, this date, some GPS receivers may start to behave strangely. The data they output may jump backwards in time, resulting in month and year timestamps that are potentially up to 20 years out of date. This is a known issue; in April 2018 the Department of Homeland Security in the United States issued a memo to make GPS users aware of the situation. Any changes, adjustments, or other actions are ultimately the responsibility of the user, so DHS strongly recommends owners and operators of critical infrastructure to prepare for the rollover. This refers to the GPS Week Rollover on Coordinated Universal Time (UTC) derived from GPS devices.

The detail

The potential problem revolves around the way that GPS handles the week element of the data that forms part of the navigation signal; specifically the CNAV and MNAV message types. The week number is encoded into the data stream by a 10-bit field. A binary 10-bit word can represent a maximum of 1,024 weeks, which is approximately 19.7 years. Each 19.7-year period is known in GPS terms as an “epoch”. At the end of each epoch the receiver resets the week number to zero and starts counting again – a new epoch begins. The first epoch started when GPS was launched in January 1980; hence the first epoch of GPS time came to an end on 21st August 1999. As we approach the end of the second epoch, which will fall on 6th April 2019, we may well see problems caused by the rollover. Some GPS receivers, or other systems that utilise the date and time function, may not be able to cope.

Who will be affected?

The list is long and varied; some industries come to mind immediately as they are known to use the accurate timing information provided by the GPS constellation. 
Financial markets, power generating companies, emergency services and industrial control systems may be affected, as well as fixed-line and cellular communications networks. GPS tracking devices installed in a fleet management system to schedule and monitor deliveries could cause system errors if they start to provide location data that is potentially up to 20 years out of date. Since this is the second time a GPS week rollover will occur, many manufacturers will have been aware of it in advance and newer receivers will continue through and beyond the rollover date without issue. Issues could occur if:
  • Receivers have been in operation for more than 10-15 years without firmware updates
  • Receivers are part of a critical timing system, i.e. what would be the impact to the system if the GPS receiver starts to output incorrect UTC data?

What can be done?

There is an ongoing multi-billion-dollar program of upgrades to improve the overall performance of the Global Positioning System. Included within this program is a move towards a 13-bit date field to represent the week number. Thirteen binary bits represent around 157 years per epoch. Newer GPS receivers won’t be affected by the current end-of-epoch situation. Notwithstanding that, if unsure, the best advice is to check with the manufacturer of the devices you use. This situation won’t affect a receiver’s ability to navigate and/or calculate precise time, but it has the potential to create week, month and year timestamps that are wildly wrong. Applications which rely on GPS data at that level may be seriously affected. Other GNSSs, such as the Russian GLONASS, European Galileo or Chinese BeiDou, are not affected by this problem.

Using receivers indoors via a GPS repeater

If your receivers are operated indoors by means of a GPS repeater system, the contents of the GPS signal are not affected at all by the repeater. Anything that is transmitted by the GPS satellites is passed transparently through to the interior. 
Any checks carried out to ascertain the potential effect of the GPS week rollover need not involve the GPS repeater if installed. source: https://www.gps-repeaters.com/blog/gps-week-number-rollover-april-6th-2019/

Okay, my guess is this happens only in static occupation (confirmed by our distributor); I had 2 measurements affected by this bug. I noticed a couple of things: 1. When the data collector syncs with the GPS, the date goes back to 1999. 2. The ephemeris data has the wrong date (1999). In this case we use the Topcon GR-5. I tried updating to the latest firmware and resetting the GPS, but the date is still wrong; it should return to the current date, but I still get 1999. Anyone with the same issue? Is it possible to correct the faulty data?
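The 10-bit week arithmetic the article describes can be sketched in a few lines. The GPS epoch start (6 January 1980), the 1,024-week wrap, and the proposed 13-bit field come from the article; the function names and the disambiguation strategy below are my own illustration, not any receiver's actual firmware logic:

```python
from datetime import date, timedelta

GPS_EPOCH_START = date(1980, 1, 6)  # start of GPS week 0

def rollover_date(epoch_index, week_bits=10):
    # Each epoch spans 2**week_bits weeks; the week counter then wraps to zero.
    weeks_per_epoch = 2 ** week_bits
    return GPS_EPOCH_START + timedelta(weeks=weeks_per_epoch * epoch_index)

def true_date_from_truncated_week(reported_week, reference=None, week_bits=10):
    # One way a receiver can recover the true date from a truncated week count:
    # pick the most recent week with that residue not after a trusted reference
    # date (e.g. a firmware build date).
    reference = reference or date.today()
    weeks_per_epoch = 2 ** week_bits
    elapsed_weeks = (reference - GPS_EPOCH_START).days // 7
    epoch = (elapsed_weeks - reported_week) // weeks_per_epoch
    return GPS_EPOCH_START + timedelta(weeks=epoch * weeks_per_epoch + reported_week)

print(rollover_date(1))  # first epoch ends the night of 21/22 August 1999
print(rollover_date(2))  # second epoch ends the night of 6/7 April 2019
print(rollover_date(1, week_bits=13))  # 13-bit field: next wrap ~157 years after 1980
```

This also shows why an un-updated receiver jumps back exactly 1,024 weeks: without an epoch hint, week 0 after the rollover is indistinguishable from week 0 of August 1999.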
  8. welcome, and enjoy your stay
  9. Topcon Positioning Group’s Dave Henderson offers a rundown on the company’s latest products, including the Falcon 8+ drone, Sirius Pro, MR-2 modular receiver, and B210 and B125 receiver boards, at Xponential 2019. source: https://www.gpsworld.com/topcon-showcases-falcon-8-drone-sirius-pro-and-receiver-boards-at-xponential-2019/
  10. DT Research, the leading designer and manufacturer of purpose-built computing solutions for vertical markets, today announced the DT301X-TR Rugged Tablet, a lightweight military-grade tablet that is purpose-built to enhance precision for bridge and construction inspections, 3D surveying, mapping of underground utilities, and crime and crash scene reconstruction. With a 10.1” high-brightness capacitive touch screen that can easily be read in a wide range of lighting conditions indoors and outdoors, a choice of Intel 8th generation Core i5 or i7 processors, and MIL-spec and IP ratings to hold up to real-world hazards, the DT301X-TR performs in many industries and environments. The DT301X-TR integrates the optional Intel® RealSense™ Depth camera, which provides real-time 3D imaging to quickly and accurately create measurements for CAD, engineering, design, utility and project management, and crime/crash scene forensics. Scientific-grade data, which is important for evidence as well as building plans, is now easier to access and use for specialists and non-credentialed workers alike. With this 3D camera technology, depth perception is integrated to add the most accurate image to keep projects factual and consistent. The integration of the 3D camera with a rugged handheld tablet improves mobility, reduces the bulk and limitations of a laser scanner for small, hard-to-reach spaces, and brings measurement, real-time scanning, and positioning together in one device which can also be used to process and transmit the data. Using rugged tablets with 3D technology allows the as-built status of a project to be tracked and documented in real time, reducing the project cycle time, and also allows data to be shared with the owner, general contractor and subs as it is captured. 
This boost to efficiency and accuracy validation shortens payment cycles as well as improving the overall BIM (Building Information Modeling), getting infrastructure going quickly and getting payments to contractors faster. “The combination of the DT Research rugged tablet with the RealSense depth camera and DotProduct’s Dot3D Pro software enables projects to be quickly set up, tracked, and completed for all staff and tasks whether in the office or on the site. The ease of use these tools bring to 3D workflows can benefit a wide range of applications from construction verification to asset management to crime scene mapping,” says Tom Greaves, chief marketing officer at DotProduct. The DT301X-TR also provides multi-frequency GNSS RTK with carrier phase for real-time mapping and positioning, and supports GPS, GLONASS, BeiDou, Galileo, and QZSS. The optional foldable antenna supports highly accurate field measurements, taken directly with RTK GNSS positioning, or the tablet can be connected to an external antenna for higher precision. Some other data capture options offered on the DT301X-TR besides the 3D camera are a 2D barcode scanner for equipment/location tags, long-range Bluetooth with a 1000ft range ideal for connecting to Robotic Total Stations, and 4G LTE mobile broadband for the latest in high-speed communications. Another option is a bright LED light that can be attached to the DT301X-TR and stay consistently on for up to two hours, bringing light to underground infrastructure mapping and scanning. The flexibility for setup and use is enhanced in the DT301X-TR rugged tablet with the Microsoft Windows® 10 IoT Enterprise operating system for convenient integration with existing applications, bringing together the advanced workflow for data capture, accurate positioning and data transmission. 
With high-capacity 60 or 90 watt hot-swappable batteries, the rugged yet lightweight DT301X-TR keeps working continuously whether in the field, office, or vehicles, complemented with a variety of battery chargers so fully charged batteries are always available. This rugged tablet gives detailed accuracy combined with the latest 3D camera technology all in one tablet that is rugged and easy to use in the field. Whether at the construction site, mapping underground utilities, or at the freeway crash scene, the cost-effective DT301X-TR is ideal for accurate measurements to enable data-driven decisions, able to travel to wherever the work is. The DT301X-TR Rugged Tablet will be available in May 2019 from DT Research’s authorized resellers and partners. DT Research will be at booth 217 at AEC’s BuildTech show. source: https://gisuser.com/2019/05/purpose-built-rugged-tablet-for-accurate-real-time-measurements/
  11. The state of Alaska is beautiful and wild — no wonder it is called the “Last Frontier”. The land has more than 130 volcanoes that pose a grave threat to residents. A joint project by the National Oceanic and Atmospheric Administration (NOAA) and NASA (National Aeronautics and Space Administration) has given scientists and forecasters a platform to protect people from volcanic ash. This is one of the many case studies highlighted by the space agency on its new website, SpaceforUS, which intends to highlight how NASA has used its earth observation data to better the living conditions of people in all 50 states of America. For the past six decades, NASA has used observations from space to understand the “Blue Planet” better. By using its fleet of space technologies, it has improved the lives of the people of America. Some 25,000 flights fly over Alaskan volcanoes, which can be even more hazardous during eruptions, when volcanoes discharge volcanic ash. A joint initiative by NOAA and NASA, the project tracks ash clouds and guides regulators and airlines. Through this new and interactive website, SpaceforUS, NASA wishes to highlight the innumerable ways in which its earth observations have helped administrators take informed decisions in the areas of public health, disaster response and environmental protection. The site, also being termed NASA’s communication project, explores the stories behind the innovative technology, ground-breaking insights and extraordinary collaborations.

Single platform

SpaceforUS has a total of 56 stories that illustrate NASA’s science and the impact it has managed to have in all 50 states, including the District of Columbia, Puerto Rico and regions along the Atlantic, Pacific, Gulf of Mexico and the Great Lakes. On the website, readers can browse stories on animals, disasters, energy, health, land and water either by state or by topic. 
The website showcases the power of earth observation through state-by-state project examples — from guiding pilots around hazardous volcanic ash plumes over Alaska to aiding first responders after devastating hurricanes in North Carolina. During Hurricane Rita, NASA created high-quality satellite images that identified power outages, guiding first responders delivering life-saving aid. On SpaceforUS, each click brings readers a story about the different ways in which people are using NASA data in their day-to-day lives.

Open Data

To all those seeking solutions to imperative global issues, NASA also provides free and open earth observation data on issues related to changing freshwater availability, food security and human health. NASA’s Applied Sciences program provides financial assistance to projects that facilitate innovative uses of NASA Earth science data to ensure that well-informed decisions are made, not only to strengthen America’s economy but also to improve the quality of life globally. In the state of Arizona, NASA Earth observations have already identified the hottest areas around the Phoenix metropolitan area, making civic planning and environmental monitoring a lot easier. website: https://www.nasa.gov/SpaceforUS/
  12. Tallysman, a leading manufacturer of high-performance GNSS and Iridium antennas, is announcing the first three products of a new range of helical antennas to be introduced in 2019. The first three models of the Tallysman helical family are:
  • HC871 (25g) – a housed, dual-band, active GNSS antenna, supporting GPS L1/L2, GLONASS G1/G2, Galileo E1, and BeiDou B1.
  • HC872 (36g) – a housed, dual-band, active GNSS antenna, supporting GPS L1/L2, GLONASS G1/G2, Galileo E1, BeiDou B1, and L-Band services.
  • HC600 (18g) – a housed, passive Iridium antenna.

The active GNSS helical antennas feature an industry-leading low-current Low Noise Amplifier (LNA), and include integrated low-loss pre-filters to protect against harmonic interference from high-amplitude interfering signals, such as 700MHz band LTE and other near in-band cellular signals. Available in both housed and embedded OEM versions, the lightweight Tallysman helical antennas have excellent axial ratios, making them ideal for a variety of high-precision Unmanned Aerial Vehicle (UAV) applications. Tallysman is announcing the availability of its first three models of the helical family, with additional models to be announced in Q3 2019 and onward. sources: https://www.geospatialworld.net/news/tallysman-introduces-lightweight-helical-gnss-and-iridium-antennas/
  13. Velodyne Lidar, Inc has announced an agreement with Nikon Corporation, under which Sendai Nikon Corporation, a Nikon subsidiary, will manufacture LiDAR sensors for Velodyne with plans to start mass production in the second half of 2019. The partnership cements Velodyne’s manufacturing plan and expands its lead in the global LiDAR sensor market. “Mass production of our high-performance lidar sensors is key to advancing Velodyne’s immediate plans to expand sales in North America, Europe, and Asia,” said Marta Hall, President and CBDO, Velodyne Lidar. “With this partnership, Velodyne affirms its leadership role in designing, producing, and selling lidar for worldwide implementation. For years, Velodyne has been perfecting lidar technology to produce thousands of lidar units for autonomous vehicles (AVs) and advanced driver assistance systems (ADAS). It is our goal to produce lidar in the millions of units with manufacturing partners such as Nikon.” Velodyne is the leading supplier of lidar sensors to the automotive industry with more than 250 customers globally. Beyond AVs and ADAS, Velodyne will leverage Nikon’s mass manufacturing prowess as lidar sales expand within other emerging markets. Accelerated by its partnership with Nikon, Velodyne’s low cost lidar solutions will benefit a range of business segments, including robotics, security, mapping, agriculture, and unmanned aerial vehicles (UAVs). “Working with Nikon, an expert in precision manufacturing, is a major step toward lowering the cost of our lidar products. Nikon is notable for expertly mass-producing cameras while retaining high standards of performance and uncompromising quality. Together, Velodyne and Nikon will apply the same attention to detail and quality to the mass production of lidar. Lidar sensors will retain the highest standards while at the same time achieving a price that will be more affordable for customers around the world,” said Hall. 
Nikon invested USD 25 million in Velodyne last year and now aims to combine Nikon’s optical and precision technologies with Velodyne’s lidar sensor technology. Since the investment, both companies have been investigating a business alliance that includes collaboration in technology development and manufacturing. This manufacturing agreement represents the initial phase of the Velodyne/Nikon business alliance. The companies will continue to investigate further areas of a wide-ranging and multifaceted business alliance. sources: https://www.geospatialworld.net/news/velodyne-nikon-announce-manufacturing-agreement-for-production-of-lidar-sensors/