Ornithology Exchange

About This Group

This group provides support for those using geolocator technology to study animal movement and behavior.
  1. What's new in this group
  2. Hi, I am analysing GLS data and got errors using the lightImage() and preprocessLight() functions. The error I get is the following (actually the same in both cases):

        TwGeos::lightImage(raw, offset = offset, zlim = c(0, 64), main = gls_sel)
        Error in if (as.numeric(tmin) > as.numeric(date[1])) tmin <- tmin - 24 * :
          argument is of length zero

        twl <- preprocessLight(raw, threshold = threshold, offset = offset,
                               lmax = 64,          # max. light value
                               gr.Device = "x11")
        Error in if (as.numeric(tmin) > as.numeric(date[1])) tmin <- tmin - 24 * :
          argument is of length zero
        In addition: Warning messages:
        1: In min(tagdata$Date) : no non-missing arguments to min; returning Inf
        2: In max(tagdata$Date) : no non-missing arguments to max; returning -Inf

     I was able to get the light image at the beginning, but later, for some reason, I only got the error message. Interestingly, I can plot the light image using the tsimage() function with the same data:

        tsimage(date = raw$dtime, y = raw$lig, offset = offset,
                ylab = "Hour local [GMT + 10h]", main = gls_sel)

     I tried it on three versions of R (3.6.3, 4.0.2, 4.1.0) on different computers, all running macOS, and on data from different tracks, always getting the same error message. I would be very thankful if you could help me find out what may cause these errors. Zuzana
  3. Hi! Your question is a few months old already, so I hope you managed to fix the issue; otherwise, it seems to me that it could be a decimal-separator issue. In the example you shared, the separator for the temperature is a comma rather than a point, and I imagine that the "sep" parameter of the read.table function is set to a comma to read csv-type files (which seems to be the case for yours). So the function tries to assign 5 values to only 4 headers. This decimal-separator issue is common on computers set to certain languages (e.g., the French decimal separator is a comma). Hope it helps
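     As a minimal sketch of one workaround (not the readTem2() internals): read the raw lines, glue the last two comma-separated fields back into a single decimal number, and then parse. The column names and date format below are assumptions; adjust them to your own tags.

        lines <- readLines(paste0(tagdata, ".tem"))[-1]      # skip the first line, as in readTem2(..., skip = 1)
        fixed <- sub("(\\d+),(\\d+)$", "\\1.\\2", lines)     # "12,625000" -> "12.625000"
        tem <- read.table(text = fixed, sep = ",",
                          col.names = c("ok", "datetime", "julian", "temperature"),
                          stringsAsFactors = FALSE)
        tem$datetime <- as.POSIXct(tem$datetime, format = "%d/%m/%y %H:%M:%S", tz = "GMT")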
  4. Hi Eldar, thank you for the reply. I get overlapping median dates, or they seem quite unrealistic (like those in the first post, where the departure date is 11 October and the arrival date at the next stopover, over 1000 km away, is 12 October). I tried adjusting the probability cutoff. The maximum I was able to set was 0.3. If I set 0.5 I get a message that the bird didn't move.
  5. Hi Beata, what kind of overlapping dates do you get? It is likely that dates will overlap at the ends - around the 90-95th percentile. Try adjusting prob.cutoff to, e.g., 0.5 in stationary.migration.summary(). I think you should then get the same results as from find.times.distribution(). Hope this helps, Eldar
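     A minimal sketch of that call, assuming `Result` is the output of run.particle.filter(); the prob.cutoff and min.stay values are illustrative, not recommendations:

        library(FLightR)
        Summary <- stationary.migration.summary(Result, prob.cutoff = 0.5, min.stay = 3)
        str(Summary)   # inspect the returned components (stationary periods, movement times)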
  6. Hi, before calling make.prerun.object() you should load the saved calibration file for a tag that worked: load(file = "path to *Calibration.RData"). I also think your calibration period is not set properly. Right now ten years are used for calibration... You should only use periods with a known tag location here.
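     A minimal sketch of the two options, with a placeholder file name and dates; replace them with your own deployment information:

        # Option 1: reuse a calibration saved from a tag that worked
        load(file = "path/to/Calibration.RData")   # loads the saved Calibration object into the workspace

        # Option 2: restrict the calibration period to dates when the tag was at a known site
        Calibration.periods <- data.frame(
          calibration.start = as.POSIXct("2014-01-10", tz = "GMT"),
          calibration.stop  = as.POSIXct("2014-02-05", tz = "GMT"),
          lon = start[1], lat = start[2])
        Calibration <- make.calibration(Proc.data, Calibration.periods)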
  7. Hi! I am trying to open my .tem files from MK3 geolocators. The problem is that I don't know how; I downloaded the .tem file with the temperatures using commas rather than points. So every time I try to open the file I get an error:

        > d.deg_prim <- readTem2(paste0(tagdata, ".tem"), skip = 1)
        Error in read.table(file = file, header = header, sep = sep, quote = quote, :
          more columns than column names

     This is what my .tem file looks like:

        255,19,-0,499986,
        ok,06/04/19 12:17:54,43561.512431,12,625000
        ok,06/04/19 13:13:42,43561.551181,12,750000
        ok,06/04/19 14:20:45,43561.597743,12,750000
        ok,07/04/19 11:41:48,43562.487361,12,500000

     Any ideas on how I can solve this problem? I tried to change this from Excel, but then the file can't be saved as .tem. Thanks!!!!
  8. Hi all, my concern is related to the posts on "plot_slopes_by_location". I am currently working on GLS data (old BAS tags) from seabirds (albatrosses) deployed for 2-4 years, some of which navigate circumpolar routes during the non-breeding period. 1) The first issue is that I do not have proper calibration data, so I tried to use an approximate calibration from a very short period, different loggers, and a site ~80 km from the deployment site. I see that you suggest using the calibration extracted from a tag that worked, but I failed to understand how to do that. 2) Using my approximate calibration data, I proceed as follows:

        start <- c(77.53, -38.71)
        log.light.borders <- c(1.5, 9)   # default values for Intigeo tag
        log.irrad.borders <- c(-3, 3)    # default values for Intigeo tag
        Calibration.periods <- data.frame(
          calibration.start = as.POSIXct("2010-01-01"),
          calibration.stop  = as.POSIXct("2020-01-01"),
          lon = start[1], lat = start[2])
        calibration.parameters <- get.calibration.parameters(
          Calibration.periods, Proc.data, model.ageing = FALSE,
          log.light.borders = log.light.borders,
          log.irrad.borders = log.irrad.borders)

     Then I tried both make.calibration options, without any change in the results:

        Calibration <- make.calibration(Proc.data, Calibration.periods,
                                        likelihood.correction = "auto")
        Calibration <- make.calibration(Proc.data, Calibration.periods,
                                        likelihood.correction = FALSE,
                                        model.ageing = TRUE, plot.final = TRUE)

     As you can see in the attached plot, it failed to estimate the slope in early 2014 (corresponding to the period when the bird crosses the date line). I then ran the end of the script:

        all.in <- make.prerun.object(FLightR.data, Grid, start = c(77.53, -38.71),
                                     Calibration = Calibration,
                                     threads = 1,   # reduced to one core/node
                                     M.mean = 750)
        nParticles <- 1e4   # for a test; ~5-6 h for n = 2529
        Result <- run.particle.filter(all.in, threads = 1, nParticles = nParticles,
                                      known.last = TRUE, precision.sd = 25,
                                      check.outliers = FALSE, b = 1700)

     But this resulted in erroneous location estimates for the final part of the track. I also tried cutting my original file into before/after files based on the date-line crossing, without any difference. Thanks, best, Karine
  9. Hi, I am performing GLS analysis of Intigeo devices in FLightR. When I use the stationary.migration.summary function I get overlapping dates for the stationary periods. So I decided to use find.times.distribution to find arrival/departure dates to/from the stationary periods calculated by stationary.migration.summary. Now I am wondering what key I should use to unify this approach across all my birds? Any advice? Here is a piece of my results for two stationary periods:

        Meanlat    SDlat      Meanlon     SDlon      Dist2      Arrival.Q.50          Departure.Q.50
        25.595678  2.4360796  -12.919237  1.0270114   453.1756  2019-09-28 07:40:58   2019-10-11 20:53:39
        19.598708  2.1146732   -3.262447  0.6437934  1310.7202  2019-10-12 00:57:31   2020-03-16 08:01:10
  10. Bonjour, I am working with geolocator data collected with Intigeo C65 devices on seabirds breeding in the Arctic (upper red dot on map1) and migrating to the southern Atlantic in winter. Part of the migration occurs during the spring and fall equinoxes. I used a rooftop calibration made in southern Quebec (lower red dot on map1) over about 14 days, and I used the findHEZenith function to calculate a new zenith angle for the wintering area. I then set a list of zenith angles that change depending on the period of the year. My whole script runs, but for some birds I get a strange movement up north after a first staging period. When I create the initial path (thresholdPath function), everything seems to be okay, but when I run the estelleMetropolis function in 3 steps (burn-in, tuning, final runs), no matter how many iterations I do and no matter how many chains I set, the up-north movement appears. On the path on map2, the blue part is ± 10 days around the equinoxes, gray is ± 15 days, and black ± 20 days. I tried changing all the parameters one by one, but the problem persists. Any idea what could cause this issue? Merci, Yannick
  11. Hi Melina, please make sure that there are no NAs in the Date column: sum(is.na(data$Date)). If this is > 0 then that is the fault, and you need to remove these rows. Cheers, Simeon
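     A quick sketch of that check, assuming the raw light data are in a data frame called `data` with a Date column, as in the post above:

        sum(is.na(data$Date))               # should be 0
        data <- data[!is.na(data$Date), ]   # if not, drop the rows with missing dates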
  12. Hi everyone! I am running preprocessLight on my geolocator data and I get the following error for some of my devices:

        Error in seq.default(as.numeric(tmin) + dt/2, as.numeric(tmax) - dt/2, :
          'from' must be of length 1

     My data span less than one year but include two different calendar years (2019 and 2020). I have tried to subset the original .lig data so that they include only one year, but it doesn't work! Any idea? Someone with the same problem? Thanks!!! Melina
  13. Hi Eli, everything works for me at the moment, but only after updating R and ggmap to the newest versions. Stamen maps also fail a lot and are thus hard to use within a function. I will wait for something more reliable before changing the function myself, but everyone is welcome to write a map_FLightR_stamenmap(). Cheers, Eldar
  14. I went around the block with Google Maps for a couple of hours and could not get past this cryptic error message when trying to download a simple example map:

        ggmap::register_google(key = APIkey)
        map <- get_map(location = "texas", zoom = 6, source = "stamen")
        Source : https://maps.googleapis.com/maps/api/staticmap?center=texas&zoom=6&size=640x640&scale=2&maptype=terrain&key=xxx
        Error in aperm.default(map, c(2, 1, 3)) :
          invalid first argument, must be an array
        In addition: Warning message:
        In get_googlemap(center = location, zoom = zoom, filename = filename) :
          HTTP 400 Bad Request

     Others have had this issue and some have solved it (there are some suggestions on Stack Overflow), but not me. I tried a bunch of enabling and management options for the Static Maps API, but nothing worked. Maybe something has changed recently with Google or ggmap? Anyway, I'm now building my maps from scratch using get_stamenmap(). It might be a better future option for map.FLightR.ggmap().
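     A minimal sketch of the get_stamenmap() route; the bounding box and zoom are placeholder values, not tied to any particular track:

        library(ggmap)
        bb <- c(left = -106, bottom = 25, right = -93, top = 37)
        basemap <- get_stamenmap(bbox = bb, zoom = 6, maptype = "terrain")
        ggmap(basemap)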
  15. Hi, I am trying to use TwGeos::preprocessLight() to identify twilights in light data collected by Lotek LAT280 tags. Below is the log of the "Light Intensity" variable. This approach doesn't seem to give a clear separation between the dark and light sections, which makes picking a threshold challenging. Does anyone have any suggestions for a transformation, a threshold, or an alternative approach with the data from this tag? Thanks!
  16. Hi, I'm analysing some geolocator data from a common sandpiper using the GeoLight package. When I run my analysis I get a weird migration pattern: the individual appears to stop in two locations (one of which is in the sea) before settling in West Africa. I'm not sure whether these are true movements from west to east or if something else is going on. Does anyone have any suggestions? These movements are not around the equinoxes either. The outputs of siteMap() and schedule() are attached. Thanks, Thomas (attachments: ADplot.tiff, AD_migration_schedule.csv)
  17. Hi Ana, these cut-off probabilities just measure how many particles moved between time t and t+1. A value of 0.1 says that there was movement between two periods if at least 10% of particles moved more than 25 km; 0.5 would require at least 50%. Generally, you are better off making it higher rather than lower. There is so far no established approach for how to select a proper value here. Hope this helps, Eldar
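     A small sketch of comparing cut-off values, assuming `Result` comes from run.particle.filter(); the component name Stationary.periods may differ between FLightR versions, so check str() on the returned object:

        for (p in c(0.1, 0.3, 0.5)) {
          Summary <- stationary.migration.summary(Result, prob.cutoff = p, min.stay = 3)
          cat("prob.cutoff =", p, "->", nrow(Summary$Stationary.periods), "stationary periods\n")
        }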
  18. Hi all, I estimated stopover locations using 0.1, 0.2, and 0.3 cut-off probabilities. The results are quite consistent, especially during the first part of migration. Can the 0.1, 0.2, and 0.3 values be translated into something more "meaningful", like km? I'm not quite sure how to explain these three values. Many thanks, Ana
  19. Finally, an almost complete user's guide for geolocator analyses. Check out the paper: https://besjournals.onlinelibrary.wiley.com/doi/10.1111/1365-2656.13036 as well as the online manual: https://geolocationmanual.vogelwarte.ch/ (attachment: Lisovski_et_al-2019-JAnimEcol.pdf)
  20. So you can install it directly from there with install.packages(). Version 0.4.9 introduces the FLightR2Movebank() function, which saves results for Movebank upload. The latest version is on GitHub, as always.
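     A minimal sketch of the two install routes; the GitHub repository name eldarrak/FLightR is assumed here:

        install.packages("FLightR")                   # CRAN release
        # install.packages("remotes")
        remotes::install_github("eldarrak/FLightR")   # latest development version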
  21. Sorry, I am not sure what is happening... Are you working on a Mac? If not, try gr.Device = "default".
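     As a minimal sketch, assuming `raw`, `threshold` and `offset` are already defined as in the earlier posts in this thread:

        twl <- preprocessLight(raw, threshold = threshold, offset = offset,
                               lmax = 64, gr.Device = "default")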
  22. Thanks for your help, Eldar! Distance does seem to be key. Increasing the maximum distance in particle.filter has improved the return track. I am now working on adjusting the mean and sd. Forgive a beginner question, but is there an easy way to estimate appropriate values? I have the result of a particle.filter run that produced an OK-looking track, but I'm not quite sure what I should use. Should I include all movements in my estimates? These are birds that remain in a relatively small area for most of the year, but travel long distances quickly during migration...
  23. Hi Simeon, thank you so much for responding. I checked my NAs and didn't get any:

        summary(raw)
             Date                          Light
         Min.   :2013-02-22 00:00:02   Min.   : 0.0002
         1st Qu.:2013-04-28 17:58:47   1st Qu.: 0.0002
         Median :2013-07-03 11:57:32   Median : 6.4847
         Mean   :2013-07-03 11:57:32   Mean   : 5.1559
         3rd Qu.:2013-09-07 05:56:17   3rd Qu.: 8.8506
         Max.   :2013-11-11 23:55:02   Max.   :11.2176

     I set the threshold to -0.6, 0.6, 1, and 2. I went through the steps of selecting the beginning and ending dates and the darkest spot in the no-light area so that the package could select the sunrises and sunsets. I accepted all the other screens, and the error keeps happening, particularly when I hit (a) accept at the stage depicted in the attached image.
  24. Hi Jenny, 1. You could try increasing the distance they are allowed to fly. The maximum distance can be fixed in particle.filter, and the mean and sd in make.prerun.object. 2. Try running first without any spatial constraints. It sometimes happens that the particle filter just cannot find a proper solution if the bird migrated too far every day for a long period. Hope this helps, Eldar
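     A minimal sketch of where these parameters live, reusing values from the posts above as placeholders; M.sd and the exact argument set should be checked against ?make.prerun.object and ?run.particle.filter for your FLightR version:

        all.in <- make.prerun.object(FLightR.data, Grid, start = start,
                                     Calibration = Calibration,
                                     M.mean = 750, M.sd = 300)   # movement mean / sd (km)
        Result <- run.particle.filter(all.in, nParticles = 1e6,
                                      known.last = TRUE, check.outliers = FALSE,
                                      b = 1700)                  # maximum distance per transition (km)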