Compare commits
73 Commits
Author | SHA1 | Date |
---|---|---|
jan | 8ef52f8f59 | 2 years ago |
jan | 8d0746b263 | 2 years ago |
jan | fc2ee05f77 | 2 years ago |
jan | e3099ca35c | 2 years ago |
jan | 19351bd487 | 2 years ago |
jan | 1732084856 | 2 years ago |
jan | 334a49a309 | 2 years ago |
jan | 95f4f889da | 2 years ago |
jan | ddf911e6a7 | 2 years ago |
jan | 6ebfc7f50a | 2 years ago |
jan | 0e1d9e3cad | 2 years ago |
jan | 9911226a76 | 2 years ago |
jan | 97475dbbff | 2 years ago |
jan | eec6f84806 | 2 years ago |
jan | 86e80d52c9 | 2 years ago |
jan | 86efd52bf2 | 2 years ago |
jan | 60445d70c5 | 2 years ago |
jan | b1001b174e | 2 years ago |
jan | e2e9e3dbb3 | 2 years ago |
jan | 7e8eb46c8e | 2 years ago |
jan | 262141751f | 2 years ago |
jan | a22c46ba14 | 2 years ago |
jan | 37d4dde1d6 | 2 years ago |
jan | 62667ef139 | 2 years ago |
jan | b26ca150a9 | 2 years ago |
jan | d919f89082 | 2 years ago |
jan | 64db38c291 | 2 years ago |
jan | e4216b3572 | 2 years ago |
jan | a4b878fc8f | 2 years ago |
jan | 7e1f42b8b9 | 2 years ago |
jan | ebb9f5bd1f | 2 years ago |
jan | 35c9e84302 | 2 years ago |
jan | c8925df68d | 2 years ago |
jan | 4e1b5d6389 | 2 years ago |
jan | 8393048957 | 2 years ago |
jan | 34a5dfae19 | 2 years ago |
jan | 438eda09cd | 2 years ago |
jan | 36f4701557 | 2 years ago |
jan | 0292779ca5 | 2 years ago |
jan | 4a81cab11c | 2 years ago |
jan | 4b24aa9582 | 2 years ago |
jan | 35c3014dee | 2 years ago |
jan | 35d9ee697b | 2 years ago |
jan | ca7c32ee64 | 2 years ago |
jan | 6d5c821d3a | 2 years ago |
jan | b4ee1b9ff6 | 2 years ago |
jan | fda76ed670 | 2 years ago |
jan | 0529dd9a29 | 2 years ago |
jan | 379b38046d | 2 years ago |
jan | 24851046f2 | 2 years ago |
jan | e4825ca14c | 2 years ago |
jan | 92672ae74c | 2 years ago |
Norwin | f69cf62b27 | 5 years ago |
Norwin | b69e5dc57f | 6 years ago |
Norwin | 4b01bbbee1 | 6 years ago |
Norwin | 9ddc077bfd | 6 years ago |
Norwin | c618907853 | 6 years ago |
Norwin | 3e56cd1a0e | 6 years ago |
Norwin | 8936ff270c | 6 years ago |
Norwin | e853430c8e | 6 years ago |
Norwin | e37f572a94 | 6 years ago |
Norwin | 32d0cceb28 | 6 years ago |
Norwin | 92cbbcbfc7 | 6 years ago |
Norwin | ee491673fa | 6 years ago |
Norwin | ddc6289ce3 | 6 years ago |
Norwin | de3a05bf97 | 6 years ago |
Norwin | 8d515a5fb0 | 6 years ago |
Norwin | 18a698b375 | 6 years ago |
Norwin | 4d33fa9029 | 6 years ago |
Norwin | 80dc58a298 | 6 years ago |
Norwin | c89cd274a5 | 6 years ago |
Norwin | abcfbf5910 | 6 years ago |
Norwin | 33a9c42e54 | 6 years ago |
@@ -1,25 +1,20 @@
 # Contributor Code of Conduct
 
 As contributors and maintainers of this project, we pledge to respect all people who
-contribute through reporting issues, posting feature requests, updating documentation,
-submitting pull requests or patches, and other activities.
+contribute through any means.
 
 We are committed to making participation in this project a harassment-free experience for
-everyone, regardless of level of experience, gender, gender identity and expression,
-sexual orientation, disability, personal appearance, body size, race, ethnicity, age, or religion.
+everyone, regardless of their level of experience and personal or cultural traits.
 
-Examples of unacceptable behavior by participants include the use of sexual language or
-imagery, derogatory comments or personal attacks, trolling, public or private harassment,
-insults, or other unprofessional conduct.
+Examples of unacceptable behavior by participants include derogatory comments,
+personal attacks, and trolling, both in public or private.
 
-Project maintainers have the right and responsibility to remove, edit, or reject comments,
-commits, code, wiki edits, issues, and other contributions that are not aligned to this
-Code of Conduct. Project maintainers who do not follow the Code of Conduct may be removed
-from the project team.
+Project maintainers have the right and responsibility to remove, edit, or reject any
+contributions that are not aligned to this Code of Conduct. Project maintainers who
+do not follow the Code of Conduct may be removed from the project team.
 
 Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by
 opening an issue or contacting one or more of the project maintainers.
 
-This Code of Conduct is adapted from the Contributor Covenant
-(http://contributor-covenant.org), version 1.0.0, available at
-http://contributor-covenant.org/version/1/0/0/
+This Code of Conduct is adapted from the [Contributor Covenant version 1.0.0](http://contributor-covenant.org/version/1/0/0/).
@@ -1,339 +0,0 @@
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991

Copyright (C) 1989, 1991 Free Software Foundation, Inc., <http://fsf.org/>
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
your programs, too.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.

To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.

For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.

We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.

Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.

Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.

The precise terms and conditions for copying, distribution and
modification follow.

GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".

Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.

1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.

You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.

2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:

a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.

b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.

c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)

These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.

Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.

In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.

3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:

a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,

b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,

c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)

The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.

If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.

4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.

5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.

6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.

7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.

If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.

It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.

This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.

8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.

9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.

10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.

NO WARRANTY

11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.

12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.

END OF TERMS AND CONDITIONS

How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

{description}
Copyright (C) {year} {fullname}

This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License along
with this program; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

Also add information on how to contact you by electronic and paper mail.

If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:

Gnomovision version 69, Copyright (C) year name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.

You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:

Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.

{signature of Ty Coon}, 1 April 1989
Ty Coon, President of Vice

This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License.
@@ -0,0 +1,173 @@
# client for archive.opensensemap.org
# in this archive, CSV files for the measurements of each sensor per day are provided.

default_archive_url = 'https://archive.opensensemap.org/'

#' Returns the default endpoint for the archive *download*
#' While the front end domain is archive.opensensemap.org, file downloads
#' are provided via sciebo.
osem_archive_endpoint = function () default_archive_url

#' Fetch day-wise measurements for a single box from the openSenseMap archive.
#'
#' This function is significantly faster than \code{\link{osem_measurements}} for large
#' time-frames, as daily CSV dumps for each sensor from
#' \href{https://archive.opensensemap.org}{archive.opensensemap.org} are used.
#' Note that the latest data available is from the previous day.
#'
#' By default, data for all sensors of a box is fetched, but you can select a
#' subset with a \code{\link[dplyr]{dplyr}}-style NSE filter expression.
#'
#' The function will warn when no data is available in the selected period,
#' but continue the remaining download.
#'
#' @param x A `sensebox data.frame` of a single box, as retrieved via \code{\link{osem_box}},
#'   to download measurements for.
#' @param ... see parameters below
#' @param fromDate Start date for measurement download, must be convertible via `as.Date`.
#' @param toDate End date for measurement download (inclusive).
#' @param sensorFilter An NSE formula matching \code{x$sensors}, selecting a subset of sensors.
#' @param progress Whether to print download progress information, defaults to \code{TRUE}.
#' @return A \code{tbl_df} containing observations of all selected sensors for each time stamp.
#'
#' @seealso \href{https://archive.opensensemap.org}{openSenseMap archive}
#' @seealso \code{\link{osem_measurements}}
#' @seealso \code{\link{osem_box}}
#'
#' @export
osem_measurements_archive = function (x, ...) UseMethod('osem_measurements_archive')

#' @export
osem_measurements_archive.default = function (x, ...) {
  # NOTE: to implement for a different class:
  # in order to call `archive_fetch_measurements()`, `box` must be a data.frame
  # with a single row and the columns `X_id` and `name`
  stop(paste('not implemented for class', toString(class(x))))
}
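
The NOTE in the default method can be made concrete with a sketch. Everything below is hypothetical (the class name `myclass` and its `id`/`label` fields are invented for illustration); the only real requirement stated in the comment is that `archive_fetch_measurements()` receives a single-row data.frame with `X_id` and `name` columns:

```r
# hypothetical method for an invented class 'myclass' carrying a box id and label
osem_measurements_archive.myclass = function (x, sensorId, fromDate, toDate = fromDate, ...) {
  # adapt the object into the single-row shape archive_fetch_measurements() expects
  box = data.frame(X_id = x$id, name = x$label, stringsAsFactors = FALSE)
  archive_fetch_measurements(box, sensorId, fromDate, toDate, progress = FALSE)
}
```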

# ==============================================================================
#
#' @describeIn osem_measurements_archive Get daywise measurements for one or more sensors of a single box.
#' @export
#' @examples
#' \donttest{
#' # fetch measurements for a single day
#' box = osem_box('593bcd656ccf3b0011791f5a')
#' m = osem_measurements_archive(box, as.POSIXlt('2018-09-13'))
#'
#' # fetch measurements for a date range and selected sensors
#' sensors = ~ phenomenon %in% c('Temperatur', 'Beleuchtungsstärke')
#' m = osem_measurements_archive(
#'   box,
#'   as.POSIXlt('2018-09-01'), as.POSIXlt('2018-09-30'),
#'   sensorFilter = sensors
#' )
#' }
osem_measurements_archive.sensebox = function (x, fromDate, toDate = fromDate, sensorFilter = ~ TRUE, ..., progress = TRUE) {
  if (nrow(x) != 1)
    stop('this function only works for exactly one senseBox!')

  # filter sensors using NSE, for example: `~ phenomenon == 'Temperatur'`
  sensors = x$sensors[[1]] %>%
    dplyr::filter(lazyeval::f_eval(sensorFilter, .))

  # fetch each sensor separately
  dfs = by(sensors, 1:nrow(sensors), function (sensor) {
    archive_fetch_measurements(x, sensor$id, fromDate, toDate, progress) %>%
      dplyr::select(createdAt, value) %>%
      # dplyr::mutate(unit = sensor$unit, sensor = sensor$sensor) %>% # inject sensor metadata
      dplyr::rename_at(., 'value', function(v) sensor$phenomenon)
  })

  # merge all data.frames by timestamp
  dfs %>% purrr::reduce(dplyr::full_join, 'createdAt')
}
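
The per-sensor frames are combined by a rolling full join on `createdAt`, so each phenomenon becomes its own column. A minimal sketch of that merge step, with made-up values:

```r
library(dplyr)
library(purrr)

# two hypothetical per-sensor frames, one column per phenomenon
a = tibble(createdAt = c(1, 2), Temperatur = c(20.1, 20.5))
b = tibble(createdAt = c(2, 3), rel.Luftfeuchte = c(55, 57))

# one row per timestamp; sensors without a reading at that time get NA
list(a, b) %>% purrr::reduce(dplyr::full_join, by = 'createdAt')
```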

#' fetch measurements from the archive for a single box and a single sensor
#'
#' @param box A sensebox data.frame with a single box
#' @param sensorId Character specifying the sensor
#' @param fromDate Start date for measurement download, must be convertible via `as.Date`.
#' @param toDate End date for measurement download (inclusive).
#' @param progress whether to print progress
#' @return A \code{tbl_df} containing observations of the sensor for each time stamp.
archive_fetch_measurements = function (box, sensorId, fromDate, toDate, progress) {
  osem_ensure_archive_available()

  # build the list of days to fetch
  dates = list()
  from = fromDate
  while (from <= toDate) {
    dates = append(dates, list(from))
    from = from + as.difftime(1, units = 'days')
  }

  http_handle = httr::handle(osem_archive_endpoint()) # reuse the http connection for speed!
  progress = if (progress && !is_non_interactive()) httr::progress() else NULL

  measurements = lapply(dates, function(date) {
    url = build_archive_url(date, box, sensorId)
    res = httr::GET(url, progress, handle = http_handle)

    if (httr::http_error(res)) {
      warning(paste(
        httr::status_code(res),
        'on day', format.Date(date, '%F'),
        'for sensor', sensorId
      ))

      # no data for this day: return an empty frame so the other days still bind
      if (httr::status_code(res) == 404)
        return(data.frame(createdAt = as.POSIXlt(x = integer(0), origin = date), value = double()))
    }

    httr::content(res, type = 'text', encoding = 'UTF-8') %>%
      parse_measurement_csv
  })

  measurements %>% dplyr::bind_rows()
}
#' returns URL to fetch measurements from a sensor for a specific date,
|
||||
#' based on `osem_archive_endpoint()`
|
||||
#' @noRd
|
||||
build_archive_url = function (date, box, sensorId) {
|
||||
d = format.Date(date, '%F')
|
||||
format = 'csv'
|
||||
|
||||
paste(
|
||||
osem_archive_endpoint(),
|
||||
d,
|
||||
osem_box_to_archivename(box),
|
||||
paste(paste(sensorId, d, sep = '-'), format, sep = '.'),
|
||||
sep = '/'
|
||||
)
|
||||
}
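
The resulting archive URL has the shape `<endpoint>/<YYYY-MM-DD>/<boxId>-<boxName>/<sensorId>-<YYYY-MM-DD>.csv`. A sketch with made-up identifiers (these are not real archive entries):

```r
# illustration only: hypothetical archive name and sensor id
d = '2018-09-13'
boxArchiveName = '593bcd656ccf3b0011791f5a-my_box'  # as produced by osem_box_to_archivename()
sensorId = 'ffffffffffffffffffffffff'               # hypothetical sensor id
paste('https://archive.opensensemap.org', d, boxArchiveName,
      paste0(sensorId, '-', d, '.csv'), sep = '/')
# "https://archive.opensensemap.org/2018-09-13/593bcd656ccf3b0011791f5a-my_box/ffffffffffffffffffffffff-2018-09-13.csv"
```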

#' replace chars in box name according to archive script:
#' https://github.com/sensebox/osem-archiver/blob/612e14b/helpers.sh#L66
#'
#' @param box A sensebox data.frame
#' @return character with archive identifier for each box
osem_box_to_archivename = function (box) {
  name = gsub('[^A-Za-z0-9._-]', '_', box$name)
  paste(box$X_id, name, sep = '-')
}
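
The sanitization replaces every character outside `[A-Za-z0-9._-]` with an underscore, so spaces and punctuation in a box name become `_`. For example (box name made up):

```r
# each disallowed character maps to exactly one underscore
gsub('[^A-Za-z0-9._-]', '_', 'My Box #3')
# "My_Box__3"
```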
|
||||
|
||||
#' Check if the given openSenseMap archive endpoint is available
|
||||
#' @param endpoint The archive base URL to check, defaulting to \code{\link{osem_archive_endpoint}}
|
||||
#' @return \code{TRUE} if the archive is available, otherwise \code{stop()} is called.
|
||||
osem_ensure_archive_available = function(endpoint = osem_archive_endpoint()) {
|
||||
code = FALSE
|
||||
try({
|
||||
code = httr::status_code(httr::GET(endpoint))
|
||||
}, silent = TRUE)
|
||||
|
||||
if (code == 200)
|
||||
return(TRUE)
|
||||
|
||||
errtext = paste('The archive at', endpoint, 'is currently not available.')
|
||||
if (code != FALSE)
|
||||
errtext = paste0(errtext, ' (HTTP code ', code, ')')
|
||||
stop(paste(errtext, collapse='\n '), call. = FALSE)
|
||||
FALSE
|
||||
}
|
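As an aside, the name sanitization in `osem_box_to_archivename()` is easy to illustrate on a made-up box name (the name below is hypothetical, plain base R):

```r
# every character outside [A-Za-z0-9._-] is replaced by an underscore,
# matching the archive's file naming scheme:
gsub('[^A-Za-z0-9._-]', '_', 'my senseBox (balcony)')
# → "my_senseBox__balcony_"
```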
@ -0,0 +1,126 @@
# helpers for the dplyr & co related functions
# also delayed method registration
#
# Methods for external generics (except when from `base`) should be registered,
# but not exported: see https://github.com/klutometis/roxygen/issues/796
# Until roxygen supports this use case properly, we're using a different
# workaround than suggested, copied from edzer's sf package:
# dynamically register the methods only when the related package is loaded as well.


# ====================== base generics =========================

#' maintains class / attributes after subsetting
#' @noRd
#' @export
`[.sensebox` = function(x, i, ...) {
  s = NextMethod('[')
  mostattributes(s) = attributes(x)
  s
}

#' maintains class / attributes after subsetting
#' @noRd
#' @export
`[.osem_measurements` = function(x, i, ...) {
  s = NextMethod()
  mostattributes(s) = attributes(x)
  s
}


# ====================== dplyr generics =========================

#' Simple factory function meant to implement dplyr functions for other classes,
#' which calls a callback to attach the original class again after the fact.
#'
#' @param callback The function to call after the dplyr function
#' @noRd
dplyr_class_wrapper = function(callback) {
  function(.data, ..., .dots) callback(NextMethod())
}

#' Return rows with matching conditions, while maintaining class & attributes
#' @param .data A sensebox data.frame to filter
#' @param .dots see corresponding function in package \code{\link{dplyr}}
#' @param ... other arguments
#' @seealso \code{\link[dplyr]{filter}}
filter.sensebox = dplyr_class_wrapper(osem_as_sensebox)

#' Add new variables to the data, while maintaining class & attributes
#' @param .data A sensebox data.frame to mutate
#' @param .dots see corresponding function in package \code{\link{dplyr}}
#' @param ... other arguments
#' @seealso \code{\link[dplyr]{mutate}}
mutate.sensebox = dplyr_class_wrapper(osem_as_sensebox)

#' Return rows with matching conditions, while maintaining class & attributes
#' @param .data An osem_measurements data.frame to filter
#' @param .dots see corresponding function in package \code{\link{dplyr}}
#' @param ... other arguments
#' @seealso \code{\link[dplyr]{filter}}
filter.osem_measurements = dplyr_class_wrapper(osem_as_measurements)

#' Add new variables to the data, while maintaining class & attributes
#' @param .data An osem_measurements data.frame to mutate
#' @param .dots see corresponding function in package \code{\link{dplyr}}
#' @param ... other arguments
#' @seealso \code{\link[dplyr]{mutate}}
mutate.osem_measurements = dplyr_class_wrapper(osem_as_measurements)


# ====================== sf generics =========================

#' Convert a \code{sensebox} dataframe to an \code{\link[sf]{st_sf}} object.
#'
#' @param x The object to convert
#' @param ... maybe more objects to convert
#' @return The object with an st_geometry column attached.
st_as_sf.sensebox = function (x, ...) {
  NextMethod(x, ..., coords = c('lon', 'lat'), crs = 4326)
}

#' Convert an \code{osem_measurements} dataframe to an \code{\link[sf]{st_sf}} object.
#'
#' @param x The object to convert
#' @param ... maybe more objects to convert
#' @return The object with an st_geometry column attached.
st_as_sf.osem_measurements = function (x, ...) {
  NextMethod(x, ..., coords = c('lon', 'lat'), crs = 4326)
}


# from: https://github.com/tidyverse/hms/blob/master/R/zzz.R
# Thu Apr 19 10:53:24 CEST 2018
register_s3_method <- function(pkg, generic, class, fun = NULL) {
  stopifnot(is.character(pkg), length(pkg) == 1)
  stopifnot(is.character(generic), length(generic) == 1)
  stopifnot(is.character(class), length(class) == 1)

  if (is.null(fun)) {
    fun <- get(paste0(generic, ".", class), envir = parent.frame())
  } else {
    stopifnot(is.function(fun))
  }

  if (pkg %in% loadedNamespaces()) {
    registerS3method(generic, class, fun, envir = asNamespace(pkg))
  }

  # Always register hook in case package is later unloaded & reloaded
  setHook(
    packageEvent(pkg, "onLoad"),
    function(...) {
      registerS3method(generic, class, fun, envir = asNamespace(pkg))
    }
  )
}

.onLoad = function(libname, pkgname) {
  register_s3_method('dplyr', 'filter', 'sensebox')
  register_s3_method('dplyr', 'mutate', 'sensebox')
  register_s3_method('dplyr', 'filter', 'osem_measurements')
  register_s3_method('dplyr', 'mutate', 'osem_measurements')
  register_s3_method('sf', 'st_as_sf', 'sensebox')
  register_s3_method('sf', 'st_as_sf', 'osem_measurements')
}
@ -1,39 +0,0 @@
# helpers for the dplyr & co related functions
# also custom method registration

# they need to be registered, but not exported, see https://github.com/klutometis/roxygen/issues/796
# we're using a different workaround than suggested, copied from edzer's sf package:
# dynamically register the methods only when the related package is loaded as well.

# from: https://github.com/tidyverse/hms/blob/master/R/zzz.R
# Thu Apr 19 10:53:24 CEST 2018
register_s3_method <- function(pkg, generic, class, fun = NULL) {
  stopifnot(is.character(pkg), length(pkg) == 1)
  stopifnot(is.character(generic), length(generic) == 1)
  stopifnot(is.character(class), length(class) == 1)

  if (is.null(fun)) {
    fun <- get(paste0(generic, ".", class), envir = parent.frame())
  } else {
    stopifnot(is.function(fun))
  }

  if (pkg %in% loadedNamespaces()) {
    registerS3method(generic, class, fun, envir = asNamespace(pkg))
  }

  # Always register hook in case package is later unloaded & reloaded
  setHook(
    packageEvent(pkg, "onLoad"),
    function(...) {
      registerS3method(generic, class, fun, envir = asNamespace(pkg))
    }
  )
}

register_s3_method('dplyr', 'filter', 'sensebox')
register_s3_method('dplyr', 'mutate', 'sensebox')
register_s3_method('dplyr', 'filter', 'osem_measurements')
register_s3_method('dplyr', 'mutate', 'osem_measurements')
register_s3_method('sf', 'st_as_sf', 'sensebox')
register_s3_method('sf', 'st_as_sf', 'osem_measurements')
File diff suppressed because one or more lines are too long
@ -0,0 +1,159 @@
## ----setup, results='hide', message=FALSE, warning=FALSE----------------------
# required packages:
library(opensensmapr) # data download
library(dplyr)        # data wrangling
library(ggplot2)      # plotting
library(lubridate)    # date arithmetic
library(zoo)          # rollmean()

## ----download, results='hide', message=FALSE, warning=FALSE-------------------
# if you want to see results for a specific subset of boxes,
# just specify a filter such as grouptag='ifgi' here

# boxes = osem_boxes(cache = '.')
boxes = readRDS('boxes_precomputed.rds') # read precomputed file to save resources

## -----------------------------------------------------------------------------
boxes = filter(boxes, locationtimestamp >= "2022-01-01" & locationtimestamp <= "2022-12-31")
summary(boxes) -> summary.data.frame

## ---- message=FALSE, warning=FALSE--------------------------------------------
plot(boxes)

## -----------------------------------------------------------------------------
phenoms = osem_phenomena(boxes)
str(phenoms)

## -----------------------------------------------------------------------------
phenoms[phenoms > 50]

## ----exposure_counts, message=FALSE-------------------------------------------
exposure_counts = boxes %>%
  group_by(exposure) %>%
  mutate(count = row_number(locationtimestamp))

exposure_colors = c(indoor = 'red', outdoor = 'lightgreen', mobile = 'blue', unknown = 'darkgrey')
ggplot(exposure_counts, aes(x = locationtimestamp, y = count, colour = exposure)) +
  geom_line() +
  scale_colour_manual(values = exposure_colors) +
  xlab('Registration Date') + ylab('senseBox count')

## ----exposure_summary---------------------------------------------------------
exposure_counts %>%
  summarise(
    oldest = min(locationtimestamp),
    newest = max(locationtimestamp),
    count = max(count)
  ) %>%
  arrange(desc(count))

## ----grouptag_counts, message=FALSE-------------------------------------------
grouptag_counts = boxes %>%
  group_by(grouptag) %>%
  # only include grouptags with 15 or more members
  filter(length(grouptag) >= 15 & !is.na(grouptag) & grouptag != '') %>%
  mutate(count = row_number(locationtimestamp))

# helper for sorting the grouptags by boxcount
sortLvls = function(oldFactor, ascending = TRUE) {
  lvls = table(oldFactor) %>% sort(., decreasing = !ascending) %>% names()
  factor(oldFactor, levels = lvls)
}
grouptag_counts$grouptag = sortLvls(grouptag_counts$grouptag, ascending = FALSE)

ggplot(grouptag_counts, aes(x = locationtimestamp, y = count, colour = grouptag)) +
  geom_line(aes(group = grouptag)) +
  xlab('Registration Date') + ylab('senseBox count')

## ----grouptag_summary---------------------------------------------------------
grouptag_counts %>%
  summarise(
    oldest = min(locationtimestamp),
    newest = max(locationtimestamp),
    count = max(count)
  ) %>%
  arrange(desc(count))

## ----growthrate_registered, warning=FALSE, message=FALSE, results='hide'------
bins = 'week'
mvavg_bins = 6

growth = boxes %>%
  mutate(week = cut(as.Date(locationtimestamp), breaks = bins)) %>%
  group_by(week) %>%
  summarize(count = length(week)) %>%
  mutate(event = 'registered')

## ----growthrate_inactive, warning=FALSE, message=FALSE, results='hide'--------
inactive = boxes %>%
  # remove boxes that were updated in the last two days,
  # b/c any box becomes inactive at some point by definition of updatedAt
  filter(lastMeasurement < now() - days(2)) %>%
  mutate(week = cut(as.Date(lastMeasurement), breaks = bins)) %>%
  filter(as.Date(week) > as.Date("2021-12-31")) %>%
  group_by(week) %>%
  summarize(count = length(week)) %>%
  mutate(event = 'inactive')

## ----growthrate, warning=FALSE, message=FALSE, results='hide'-----------------
boxes_by_date = bind_rows(growth, inactive) %>% group_by(event)

ggplot(boxes_by_date, aes(x = as.Date(week), colour = event)) +
  xlab('Time') + ylab(paste('rate per ', bins)) +
  scale_x_date(date_breaks="years", date_labels="%Y") +
  scale_colour_manual(values = c(registered = 'lightgreen', inactive = 'grey')) +
  geom_point(aes(y = count), size = 0.5) +
  # moving average, make first and last value NA (to ensure identical length of vectors)
  geom_line(aes(y = rollmean(count, mvavg_bins, fill = list(NA, NULL, NA))))

## ----table_mostregistrations--------------------------------------------------
boxes_by_date %>%
  filter(count > 50) %>%
  arrange(desc(count))

## ----exposure_duration, message=FALSE-----------------------------------------
durations = boxes %>%
  group_by(exposure) %>%
  filter(!is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(durations, aes(x = exposure, y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days')

## ----grouptag_duration, message=FALSE-----------------------------------------
durations = boxes %>%
  filter(!is.na(lastMeasurement)) %>%
  group_by(grouptag) %>%
  # only include grouptags with 15 or more members
  filter(length(grouptag) >= 15 & !is.na(grouptag) & !is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(durations, aes(x = grouptag, y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days')

durations %>%
  summarize(
    duration_avg = round(mean(duration)),
    duration_min = round(min(duration)),
    duration_max = round(max(duration)),
    oldest_box = round(max(difftime(now(), locationtimestamp, units='days')))
  ) %>%
  arrange(desc(duration_avg))

## ----year_duration, message=FALSE---------------------------------------------
# NOTE: boxes older than 2016 missing due to missing updatedAt in database
duration = boxes %>%
  mutate(year = cut(as.Date(locationtimestamp), breaks = 'year')) %>%
  group_by(year) %>%
  filter(!is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(duration, aes(x = substr(as.character(year), 0, 4), y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days') + xlab('Year of Registration')
@ -0,0 +1,297 @@
---
title: "Visualising the Development of openSenseMap.org in 2022"
author: "Jan Stenkamp"
date: '`r Sys.Date()`'
output:
  html_document:
    code_folding: hide
    df_print: kable
    theme: lumen
    toc: yes
    toc_float: yes
  rmarkdown::html_vignette:
    df_print: kable
    fig_height: 5
    fig_width: 7
    toc: yes
vignette: >
  %\VignetteIndexEntry{Visualising the Development of openSenseMap.org in 2022}
  %\VignetteEncoding{UTF-8}
  %\VignetteEngine{knitr::rmarkdown}
---

> This vignette serves as an example of data wrangling & visualization with
> `opensensmapr`, `dplyr` and `ggplot2`.

```{r setup, results='hide', message=FALSE, warning=FALSE}
# required packages:
library(opensensmapr) # data download
library(dplyr)        # data wrangling
library(ggplot2)      # plotting
library(lubridate)    # date arithmetic
library(zoo)          # rollmean()
```

openSenseMap.org has grown quite a bit in recent years; it would be interesting
to see how we got to the current `r osem_counts()$boxes` sensor stations,
split up by various attributes of the boxes.

While `opensensmapr` provides extensive methods of filtering boxes by attributes
on the server, we do the filtering within R to save time and gain flexibility.

So the first step is to retrieve *all the boxes*.

```{r download, results='hide', message=FALSE, warning=FALSE}
# if you want to see results for a specific subset of boxes,
# just specify a filter such as grouptag='ifgi' here

# boxes = osem_boxes(cache = '.')
boxes = readRDS('boxes_precomputed.rds') # read precomputed file to save resources
```

# Introduction
In the following we just want to have a look at the boxes created in 2022, so we filter for them.

```{r}
boxes = filter(boxes, locationtimestamp >= "2022-01-01" & locationtimestamp <= "2022-12-31")
summary(boxes) -> summary.data.frame
```

<!-- This gives a good overview already: As of writing this, there are more than 11,000 -->
<!-- sensor stations, of which ~30% are currently running. Most of them are placed -->
<!-- outdoors and have around 5 sensors each. -->
<!-- The oldest station is from August 2016, while the latest station was registered a -->
<!-- couple of minutes ago. -->

Another feature of interest is the spatial distribution of the boxes: `plot()`
can help us out here. This function requires a bunch of optional dependencies though.

```{r, message=FALSE, warning=FALSE}
plot(boxes)
```

But what do these sensor stations actually measure? Let's find out.
`osem_phenomena()` gives us a named list of the counts of each observed
phenomenon for the given set of sensor stations:

```{r}
phenoms = osem_phenomena(boxes)
str(phenoms)
```

That's quite some noise there, with many phenomena being measured by a single
sensor only, or many duplicated phenomena due to slightly different spellings.
We should clean that up, but for now let's just filter out the noise and find
those phenomena with high sensor numbers:

```{r}
phenoms[phenoms > 50]
```


# Plot count of boxes by time {.tabset}
By looking at the `createdAt` attribute of each box we know the exact time a box
was registered. Because of some database migration issues the `createdAt` values
are mostly wrong (~80% of boxes appear created on 2022-03-30), so we use the
`timestamp` attribute of the `currentlocation` instead, which should in most
cases correspond to the creation date.

With this approach we have no information about boxes that were deleted in the
meantime, but that's okay for now.

## ...and exposure
```{r exposure_counts, message=FALSE}
exposure_counts = boxes %>%
  group_by(exposure) %>%
  mutate(count = row_number(locationtimestamp))

exposure_colors = c(indoor = 'red', outdoor = 'lightgreen', mobile = 'blue', unknown = 'darkgrey')
ggplot(exposure_counts, aes(x = locationtimestamp, y = count, colour = exposure)) +
  geom_line() +
  scale_colour_manual(values = exposure_colors) +
  xlab('Registration Date') + ylab('senseBox count')
```

Outdoor boxes are growing *fast*!
We can also see the introduction of `mobile` sensor "stations" in 2017.

Let's have a quick summary:
```{r exposure_summary}
exposure_counts %>%
  summarise(
    oldest = min(locationtimestamp),
    newest = max(locationtimestamp),
    count = max(count)
  ) %>%
  arrange(desc(count))
```

## ...and grouptag
We can try to find out where the increases in growth came from, by analysing the
box count by grouptag.

Caveats: Only a small subset of boxes has a grouptag, and we should assume
that these groups are actually bigger. Also, we can see that grouptag naming is
inconsistent (`Luftdaten`, `luftdaten.info`, ...)

```{r grouptag_counts, message=FALSE}
grouptag_counts = boxes %>%
  group_by(grouptag) %>%
  # only include grouptags with 15 or more members
  filter(length(grouptag) >= 15 & !is.na(grouptag) & grouptag != '') %>%
  mutate(count = row_number(locationtimestamp))

# helper for sorting the grouptags by boxcount
sortLvls = function(oldFactor, ascending = TRUE) {
  lvls = table(oldFactor) %>% sort(., decreasing = !ascending) %>% names()
  factor(oldFactor, levels = lvls)
}
grouptag_counts$grouptag = sortLvls(grouptag_counts$grouptag, ascending = FALSE)

ggplot(grouptag_counts, aes(x = locationtimestamp, y = count, colour = grouptag)) +
  geom_line(aes(group = grouptag)) +
  xlab('Registration Date') + ylab('senseBox count')
```
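The `sortLvls()` helper above reorders factor levels by how often each level occurs. On toy data (values made up for illustration) the effect looks like this:

```r
# toy factor: 'a' occurs 3x, 'b' 2x, 'c' 1x
f = factor(c('b', 'a', 'a', 'c', 'a', 'b'))

# most frequent level first, as sortLvls(f, ascending = FALSE) would produce:
lvls = names(sort(table(f), decreasing = TRUE))
factor(f, levels = lvls) # levels: "a" "b" "c"
```

Reordering the levels this way makes ggplot2 legends list the biggest groups first.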

```{r grouptag_summary}
grouptag_counts %>%
  summarise(
    oldest = min(locationtimestamp),
    newest = max(locationtimestamp),
    count = max(count)
  ) %>%
  arrange(desc(count))
```

# Plot rate of growth and inactivity per week
First we group the boxes by `locationtimestamp` into bins of one week:

```{r growthrate_registered, warning=FALSE, message=FALSE, results='hide'}
bins = 'week'
mvavg_bins = 6

growth = boxes %>%
  mutate(week = cut(as.Date(locationtimestamp), breaks = bins)) %>%
  group_by(week) %>%
  summarize(count = length(week)) %>%
  mutate(event = 'registered')
```

We can do the same for `updatedAt`, which informs us about the last change to
a box, including uploaded measurements. Since a lot of boxes were "updated" by
the database migration, many of them carry an `updatedAt` of 2022-03-30, so we
use the `lastMeasurement` attribute instead. This leads to fewer boxes, but also
automatically excludes boxes which were created but never made a measurement.

This method of determining inactive boxes is fairly inaccurate and should be
considered an approximation, because we have no information about intermediate
inactive phases.
Deleted boxes would probably have a big impact here as well.
```{r growthrate_inactive, warning=FALSE, message=FALSE, results='hide'}
inactive = boxes %>%
  # remove boxes that were updated in the last two days,
  # b/c any box becomes inactive at some point by definition of updatedAt
  filter(lastMeasurement < now() - days(2)) %>%
  mutate(week = cut(as.Date(lastMeasurement), breaks = bins)) %>%
  filter(as.Date(week) > as.Date("2021-12-31")) %>%
  group_by(week) %>%
  summarize(count = length(week)) %>%
  mutate(event = 'inactive')
```

Now we can combine both datasets for plotting:
```{r growthrate, warning=FALSE, message=FALSE, results='hide'}
boxes_by_date = bind_rows(growth, inactive) %>% group_by(event)

ggplot(boxes_by_date, aes(x = as.Date(week), colour = event)) +
  xlab('Time') + ylab(paste('rate per ', bins)) +
  scale_x_date(date_breaks="years", date_labels="%Y") +
  scale_colour_manual(values = c(registered = 'lightgreen', inactive = 'grey')) +
  geom_point(aes(y = count), size = 0.5) +
  # moving average, make first and last value NA (to ensure identical length of vectors)
  geom_line(aes(y = rollmean(count, mvavg_bins, fill = list(NA, NULL, NA))))
```
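The `fill = list(NA, NULL, NA)` passed to `zoo::rollmean()` above pads the shortened moving-average series back to the input length, so it can share an x axis with the raw weekly counts. A minimal sketch of the padding behaviour (assuming the `zoo` package is installed):

```r
library(zoo)

# a centred 3-point moving average shortens the series by k - 1 values:
rollmean(1:5, 3)
# padding the left and right ends with NA restores the original length:
rollmean(1:5, 3, fill = list(NA, NULL, NA))
```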

And see in which weeks the most boxes become (in)active:
```{r table_mostregistrations}
boxes_by_date %>%
  filter(count > 50) %>%
  arrange(desc(count))
```

# Plot duration of boxes being active {.tabset}
While we are looking at `locationtimestamp` and `lastMeasurement`, we can also
extract the duration of activity of each box, and look at metrics by exposure
and grouptag once more:

## ...by exposure
```{r exposure_duration, message=FALSE}
durations = boxes %>%
  group_by(exposure) %>%
  filter(!is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(durations, aes(x = exposure, y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days')
```

The time of activity averages at only `r round(mean(durations$duration))` days,
though there are boxes with `r round(max(durations$duration))` days of activity,
spanning a large chunk of openSenseMap's existence.

## ...by grouptag
```{r grouptag_duration, message=FALSE}
durations = boxes %>%
  filter(!is.na(lastMeasurement)) %>%
  group_by(grouptag) %>%
  # only include grouptags with 15 or more members
  filter(length(grouptag) >= 15 & !is.na(grouptag) & !is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(durations, aes(x = grouptag, y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days')

durations %>%
  summarize(
    duration_avg = round(mean(duration)),
    duration_min = round(min(duration)),
    duration_max = round(max(duration)),
    oldest_box = round(max(difftime(now(), locationtimestamp, units='days')))
  ) %>%
  arrange(desc(duration_avg))
```

The time of activity averages at only `r round(mean(durations$duration))` days,
though there are boxes with `r round(max(durations$duration))` days of activity,
spanning a large chunk of openSenseMap's existence.

## ...by year of registration
This is less useful, as older boxes are active for a longer time by definition.
If you have an idea how to compensate for that, please send a [Pull Request][PR]!

```{r year_duration, message=FALSE}
# NOTE: boxes older than 2016 missing due to missing updatedAt in database
duration = boxes %>%
  mutate(year = cut(as.Date(locationtimestamp), breaks = 'year')) %>%
  group_by(year) %>%
  filter(!is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(duration, aes(x = substr(as.character(year), 0, 4), y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days') + xlab('Year of Registration')
```

# More Visualisations
Other visualisations come to mind, and are left as an exercise to the reader.
If you implement some, feel free to add them to this vignette via a [Pull Request][PR].

* growth by phenomenon
* growth by location -> (interactive) map
* set inactive rate in relation to total box count
* filter timespans with big dips in growth rate, and extrapolate the number of
  senseBoxes that could be on the platform today, assuming there were no production issues ;)

[PR]: https://github.com/sensebox/opensensmapr/pulls
File diff suppressed because one or more lines are too long
@ -1,73 +1,75 @@
## ----setup, include=FALSE------------------------------------------------
## ----setup, include=FALSE-----------------------------------------------------
knitr::opts_chunk$set(echo = TRUE)

## ----results = F---------------------------------------------------------
## ----results = FALSE----------------------------------------------------------
library(magrittr)
library(opensensmapr)

all_sensors = osem_boxes()
# all_sensors = osem_boxes(cache = '.')
all_sensors = readRDS('boxes_precomputed.rds') # read precomputed file to save resources

## ------------------------------------------------------------------------
## -----------------------------------------------------------------------------
summary(all_sensors)

## ----message=F, warning=F------------------------------------------------
if (!require('maps')) install.packages('maps')
if (!require('maptools')) install.packages('maptools')
if (!require('rgeos')) install.packages('rgeos')

## ---- message=FALSE, warning=FALSE--------------------------------------------
plot(all_sensors)

## ------------------------------------------------------------------------
## -----------------------------------------------------------------------------
phenoms = osem_phenomena(all_sensors)
str(phenoms)

## ------------------------------------------------------------------------
## -----------------------------------------------------------------------------
phenoms[phenoms > 20]

## ----results = F---------------------------------------------------------
pm25_sensors = osem_boxes(
  exposure = 'outdoor',
  date = Sys.time(), # ±4 hours
  phenomenon = 'PM2.5'
)
## ----results = FALSE, eval=FALSE----------------------------------------------
# pm25_sensors = osem_boxes(
#   exposure = 'outdoor',
#   date = Sys.time(), # ±4 hours
#   phenomenon = 'PM2.5'
# )

## -----------------------------------------------------------------------------
pm25_sensors = readRDS('pm25_sensors.rds') # read precomputed file to save resources

## ------------------------------------------------------------------------
summary(pm25_sensors)
plot(pm25_sensors)

## ------------------------------------------------------------------------
## ---- results=FALSE, message=FALSE--------------------------------------------
library(sf)
library(units)
library(lubridate)
library(dplyr)

# construct a bounding box: 12 kilometers around Berlin
berlin = st_point(c(13.4034, 52.5120)) %>%
  st_sfc(crs = 4326) %>%
  st_transform(3857) %>% # allow setting a buffer in meters
  st_buffer(set_units(12, km)) %>%
  st_transform(4326) %>% # the opensensemap expects WGS 84
  st_bbox()

## ----results = F---------------------------------------------------------
pm25 = osem_measurements(
  berlin,
  phenomenon = 'PM2.5',
  from = now() - days(20), # defaults to 2 days
  to = now()
)

## ----bbox, results = FALSE, eval=FALSE----------------------------------------
# # construct a bounding box: 12 kilometers around Berlin
# berlin = st_point(c(13.4034, 52.5120)) %>%
#   st_sfc(crs = 4326) %>%
#   st_transform(3857) %>% # allow setting a buffer in meters
#   st_buffer(set_units(12, km)) %>%
#   st_transform(4326) %>% # the opensensemap expects WGS 84
#   st_bbox()
# pm25 = osem_measurements(
#   berlin,
#   phenomenon = 'PM2.5',
#   from = now() - days(3), # defaults to 2 days
#   to = now()
# )
#

## -----------------------------------------------------------------------------
pm25 = readRDS('pm25_berlin.rds') # read precomputed file to save resources
plot(pm25)

## ------------------------------------------------------------------------
## ---- warning=FALSE-----------------------------------------------------------
outliers = filter(pm25, value > 100)$sensorId
bad_sensors = outliers[, drop = T] %>% levels()
bad_sensors = outliers[, drop = TRUE] %>% levels()

pm25 = mutate(pm25, invalid = sensorId %in% bad_sensors)

## ------------------------------------------------------------------------
st_as_sf(pm25) %>% st_geometry() %>% plot(col = factor(pm25$invalid), axes = T)
## -----------------------------------------------------------------------------
st_as_sf(pm25) %>% st_geometry() %>% plot(col = factor(pm25$invalid), axes = TRUE)

## ------------------------------------------------------------------------
## -----------------------------------------------------------------------------
pm25 %>% filter(invalid == FALSE) %>% plot()
File diff suppressed because one or more lines are too long
@@ -0,0 +1,25 @@
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/archive.R
\name{archive_fetch_measurements}
\alias{archive_fetch_measurements}
\title{Fetch measurements from the archive for a single box and a single sensor}
\usage{
archive_fetch_measurements(box, sensorId, fromDate, toDate, progress)
}
\arguments{
\item{box}{A sensebox data.frame with a single box}

\item{sensorId}{Character specifying the sensor}

\item{fromDate}{Start date for measurement download, must be convertible via `as.Date`.}

\item{toDate}{End date for measurement download (inclusive).}

\item{progress}{Whether to print download progress}
}
\value{
A \code{tbl_df} containing observations of all selected sensors for each time stamp.
}
\description{
Fetch measurements from the archive for a single box and a single sensor.
}
@@ -0,0 +1,15 @@
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/archive.R
\name{osem_archive_endpoint}
\alias{osem_archive_endpoint}
\title{Returns the default endpoint for the archive *download*.
While the front end domain is archive.opensensemap.org, file downloads
are provided via sciebo.}
\usage{
osem_archive_endpoint()
}
\description{
Returns the default endpoint for the archive *download*.
While the front end domain is archive.opensensemap.org, file downloads
are provided via sciebo.
}
@@ -0,0 +1,19 @@
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/archive.R
\name{osem_box_to_archivename}
\alias{osem_box_to_archivename}
\title{Replace characters in box names according to the archive script:
https://github.com/sensebox/osem-archiver/blob/612e14b/helpers.sh#L66}
\usage{
osem_box_to_archivename(box)
}
\arguments{
\item{box}{A sensebox data.frame}
}
\value{
A character vector with the archive identifier for each box.
}
\description{
Replace characters in box names according to the archive script:
https://github.com/sensebox/osem-archiver/blob/612e14b/helpers.sh#L66
}
@@ -0,0 +1,17 @@
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/api.R
\name{osem_ensure_api_available}
\alias{osem_ensure_api_available}
\title{Check if the given openSenseMap API endpoint is available}
\usage{
osem_ensure_api_available(endpoint = osem_endpoint())
}
\arguments{
\item{endpoint}{The API base URL to check, defaulting to \code{\link{osem_endpoint}}}
}
\value{
\code{TRUE} if the API is available, otherwise \code{stop()} is called.
}
\description{
Check if the given openSenseMap API endpoint is available
}
@@ -0,0 +1,17 @@
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/archive.R
\name{osem_ensure_archive_available}
\alias{osem_ensure_archive_available}
\title{Check if the given openSenseMap archive endpoint is available}
\usage{
osem_ensure_archive_available(endpoint = osem_archive_endpoint())
}
\arguments{
\item{endpoint}{The archive base URL to check, defaulting to \code{\link{osem_archive_endpoint}}}
}
\value{
\code{TRUE} if the archive is available, otherwise \code{stop()} is called.
}
\description{
Check if the given openSenseMap archive endpoint is available
}
@@ -0,0 +1,75 @@
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/archive.R
\name{osem_measurements_archive}
\alias{osem_measurements_archive}
\alias{osem_measurements_archive.sensebox}
\title{Fetch day-wise measurements for a single box from the openSenseMap archive.}
\usage{
osem_measurements_archive(x, ...)

\method{osem_measurements_archive}{sensebox}(
  x,
  fromDate,
  toDate = fromDate,
  sensorFilter = ~TRUE,
  ...,
  progress = TRUE
)
}
\arguments{
\item{x}{A `sensebox data.frame` of a single box, as retrieved via \code{\link{osem_box}},
to download measurements for.}

\item{...}{see parameters below}

\item{fromDate}{Start date for measurement download, must be convertible via `as.Date`.}

\item{toDate}{End date for measurement download (inclusive).}

\item{sensorFilter}{An NSE formula matching \code{x$sensors}, selecting a subset of sensors.}

\item{progress}{Whether to print download progress information, defaults to \code{TRUE}.}
}
\value{
A \code{tbl_df} containing observations of all selected sensors for each time stamp.
}
\description{
This function is significantly faster than \code{\link{osem_measurements}} for large
time frames, as daily CSV dumps for each sensor from
\href{https://archive.opensensemap.org}{archive.opensensemap.org} are used.
Note that the latest data available is from the previous day.
}
\details{
By default, data for all sensors of a box is fetched, but you can select a
subset with a \code{\link[dplyr]{dplyr}}-style NSE filter expression.

The function will warn when no data is available in the selected period,
but continue the remaining download.
}
\section{Methods (by class)}{
\itemize{
\item \code{osem_measurements_archive(sensebox)}: Get day-wise measurements for one or more sensors of a single box.

}}
\examples{
\donttest{
# fetch measurements for a single day
box = osem_box('593bcd656ccf3b0011791f5a')
m = osem_measurements_archive(box, as.POSIXlt('2018-09-13'))

# fetch measurements for a date range and selected sensors
sensors = ~ phenomenon \%in\% c('Temperatur', 'Beleuchtungsstärke')
m = osem_measurements_archive(
  box,
  as.POSIXlt('2018-09-01'), as.POSIXlt('2018-09-30'),
  sensorFilter = sensors
)
}
}
\seealso{
\href{https://archive.opensensemap.org}{openSenseMap archive}

\code{\link{osem_measurements}}

\code{\link{osem_box}}
}
@@ -0,0 +1,7 @@
context('API error handling')

test_that('unavailable API yields informative error message', {
  expect_error({
    osem_boxes(endpoint = 'example.zip')
  }, 'The API at example.zip is currently not available')
})
@@ -0,0 +1,66 @@
source('testhelpers.R')

context('osem_box_to_archivename()')

try({
  boxes = osem_boxes(grouptag = 'ifgi')
  box = osem_box('593bcd656ccf3b0011791f5a')
})

test_that('osem_box_to_archivename does the correct character replacements', {
  b = data.frame(
    name = 'aA1._- äß!"?$%&/',
    X_id = 'UUID'
  )

  archivename = opensensmapr:::osem_box_to_archivename(b)
  expect_equal(archivename, 'UUID-aA1._-__________')
})

test_that('osem_box_to_archivename works for one box', {
  check_api()
  if (is.null(box)) skip('no box data could be fetched')

  archivename = opensensmapr:::osem_box_to_archivename(box)
  expect_length(archivename, 1)
  expect_type(archivename, 'character')
})

test_that('osem_box_to_archivename works for multiple boxes', {
  check_api()
  if (is.null(boxes)) skip('no box data available')

  archivename = opensensmapr:::osem_box_to_archivename(boxes)
  expect_length(archivename, nrow(boxes))
  expect_type(archivename, 'character')
})

context('osem_measurements_archive()')

test_that('osem_measurements_archive works for one box', {
  check_api()
  if (is.null(box)) skip('no box data could be fetched')

  m = osem_measurements_archive(box, as.POSIXlt('2018-08-08'))
  expect_length(m, nrow(box$sensors[[1]]) + 1) # one column for each sensor + createdAt
  expect_s3_class(m, c('data.frame'))
})

test_that('osem_measurements_archive sensorFilter works for one box', {
  check_api()
  if (is.null(box)) skip('no box data could be fetched')

  m = osem_measurements_archive(box, as.POSIXlt('2018-08-08'), sensorFilter = ~ phenomenon == 'Temperatur')
  expect_length(m, 2) # one column for Temperatur + createdAt
  expect_s3_class(m, c('data.frame'))
})

test_that('osem_measurements_archive fails for multiple boxes', {
  check_api()
  if (is.null(boxes)) skip('no box data available')

  expect_error(
    osem_measurements_archive(boxes, as.POSIXlt('2018-08-08')),
    'this function only works for exactly one senseBox!'
  )
})
Binary file not shown.
@@ -0,0 +1,297 @@
---
title: "Visualising the Development of openSenseMap.org in 2022"
author: "Jan Stenkamp"
date: '`r Sys.Date()`'
output:
  html_document:
    code_folding: hide
    df_print: kable
    theme: lumen
    toc: yes
    toc_float: yes
  rmarkdown::html_vignette:
    df_print: kable
    fig_height: 5
    fig_width: 7
    toc: yes
vignette: >
  %\VignetteIndexEntry{Visualising the Development of openSenseMap.org in 2022}
  %\VignetteEncoding{UTF-8}
  %\VignetteEngine{knitr::rmarkdown}
---

> This vignette serves as an example of data wrangling & visualisation with
`opensensmapr`, `dplyr` and `ggplot2`.

```{r setup, results='hide', message=FALSE, warning=FALSE}
# required packages:
library(opensensmapr) # data download
library(dplyr)        # data wrangling
library(ggplot2)      # plotting
library(lubridate)    # date arithmetic
library(zoo)          # rollmean()
```

openSenseMap.org has grown quite a bit in the last years; it would be interesting
to see how we got to the current `r osem_counts()$boxes` sensor stations,
split up by various attributes of the boxes.

While `opensensmapr` provides extensive methods of filtering boxes by attributes
on the server, we do the filtering within R to save time and gain flexibility.

So the first step is to retrieve *all the boxes*.

```{r download, results='hide', message=FALSE, warning=FALSE}
# if you want to see results for a specific subset of boxes,
# just specify a filter such as grouptag='ifgi' here

# boxes = osem_boxes(cache = '.')
boxes = readRDS('boxes_precomputed.rds') # read precomputed file to save resources
```

# Introduction
In the following we just want to have a look at the boxes created in 2022, so we filter for them.

```{r}
boxes = filter(boxes, locationtimestamp >= "2022-01-01" & locationtimestamp <= "2022-12-31")
summary(boxes)
```

<!-- This gives a good overview already: As of writing this, there are more than 11,000 -->
<!-- sensor stations, of which ~30% are currently running. Most of them are placed -->
<!-- outdoors and have around 5 sensors each. -->
<!-- The oldest station is from August 2016, while the latest station was registered a -->
<!-- couple of minutes ago. -->

Another feature of interest is the spatial distribution of the boxes: `plot()`
can help us out here. This function requires a bunch of optional dependencies though.

```{r, message=FALSE, warning=FALSE}
plot(boxes)
```

But what do these sensor stations actually measure? Let's find out.
`osem_phenomena()` gives us a named list of the counts of each observed
phenomenon for the given set of sensor stations:

```{r}
phenoms = osem_phenomena(boxes)
str(phenoms)
```

That's quite some noise there, with many phenomena being measured by a single
sensor only, or many duplicated phenomena due to slightly different spellings.
We should clean that up, but for now let's just filter out the noise and find
those phenomena with high sensor numbers:

```{r}
phenoms[phenoms > 50]
```

# Plot count of boxes by time {.tabset}
By looking at the `createdAt` attribute of each box we know the exact time a box
was registered. Because of some database migration issues the `createdAt` values
are mostly wrong (~80% of the boxes appear as created on 2022-03-30), so we use
the `timestamp` attribute of the `currentlocation` instead, which should in most
cases correspond to the creation date.

With this approach we have no information about boxes that were deleted in the
meantime, but that's okay for now.
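
The claim about the broken `createdAt` values is easy to check. The following sketch is not part of the original analysis; it assumes the unfiltered `boxes` data.frame still carries a `createdAt` column, as `osem_boxes()` results do. Counting boxes per creation day should reveal the migration artefact as a single dominating day:

```{r createdat_check, eval=FALSE}
# Sanity-check sketch: if createdAt were trustworthy, no single day should
# dominate. A large spike at 2022-03-30 confirms the migration artefact.
boxes %>%
  count(day = as.Date(createdAt)) %>%
  arrange(desc(n)) %>%
  head()
```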

## ...and exposure
```{r exposure_counts, message=FALSE}
exposure_counts = boxes %>%
  group_by(exposure) %>%
  mutate(count = row_number(locationtimestamp))

exposure_colors = c(indoor = 'red', outdoor = 'lightgreen', mobile = 'blue', unknown = 'darkgrey')
ggplot(exposure_counts, aes(x = locationtimestamp, y = count, colour = exposure)) +
  geom_line() +
  scale_colour_manual(values = exposure_colors) +
  xlab('Registration Date') + ylab('senseBox count')
```

|
||||
Outdoor boxes are growing *fast*!
|
||||
We can also see the introduction of `mobile` sensor "stations" in 2017.
|
||||
|
||||
Let's have a quick summary:
|
||||
```{r exposure_summary}
|
||||
exposure_counts %>%
|
||||
summarise(
|
||||
oldest = min(locationtimestamp),
|
||||
newest = max(locationtimestamp),
|
||||
count = max(count)
|
||||
) %>%
|
||||
arrange(desc(count))
|
||||
```
|
||||
|
||||
## ...and grouptag
We can try to find out where the increases in growth came from, by analysing the
box count by grouptag.

Caveats: only a small subset of boxes has a grouptag, so we should assume
that these groups are actually bigger. Also, we can see that grouptag naming is
inconsistent (`Luftdaten`, `luftdaten.info`, ...).
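
Such spelling variants could be folded together before counting. The chunk below is only an illustrative sketch; the mapping is hypothetical and would need proper curation before real use:

```{r grouptag_normalise, eval=FALSE}
# Illustrative sketch: normalise known spelling variants of a grouptag
# before grouping. Extend the case_when() mapping as needed.
boxes = boxes %>%
  mutate(grouptag = case_when(
    tolower(grouptag) %in% c('luftdaten', 'luftdaten.info') ~ 'luftdaten.info',
    TRUE ~ grouptag
  ))
```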

```{r grouptag_counts, message=FALSE}
grouptag_counts = boxes %>%
  group_by(grouptag) %>%
  # only include grouptags with 15 or more members
  filter(length(grouptag) >= 15 & !is.na(grouptag) & grouptag != '') %>%
  mutate(count = row_number(locationtimestamp))

# helper for sorting the grouptags by box count
sortLvls = function(oldFactor, ascending = TRUE) {
  lvls = table(oldFactor) %>% sort(., decreasing = !ascending) %>% names()
  factor(oldFactor, levels = lvls)
}
grouptag_counts$grouptag = sortLvls(grouptag_counts$grouptag, ascending = FALSE)

ggplot(grouptag_counts, aes(x = locationtimestamp, y = count, colour = grouptag)) +
  geom_line(aes(group = grouptag)) +
  xlab('Registration Date') + ylab('senseBox count')
```

```{r grouptag_summary}
grouptag_counts %>%
  summarise(
    oldest = min(locationtimestamp),
    newest = max(locationtimestamp),
    count = max(count)
  ) %>%
  arrange(desc(count))
```

# Plot rate of growth and inactivity per week
First we group the boxes by `locationtimestamp` into bins of one week:
```{r growthrate_registered, warning=FALSE, message=FALSE, results='hide'}
bins = 'week'
mvavg_bins = 6

growth = boxes %>%
  mutate(week = cut(as.Date(locationtimestamp), breaks = bins)) %>%
  group_by(week) %>%
  summarize(count = length(week)) %>%
  mutate(event = 'registered')
```

We can do the same for `updatedAt`, which informs us about the last change to
a box, including uploaded measurements. As a lot of boxes were "updated" by the
database migration, many of them appear as updated on 2022-03-30, so we use the
`lastMeasurement` attribute instead of `updatedAt`. This leads to fewer boxes, but
also automatically excludes boxes which were created but never made a measurement.

This method of determining inactive boxes is fairly inaccurate and should be
considered an approximation, because we have no information about intermediate
inactive phases.
Also, deleted boxes would probably have a big impact here.
```{r growthrate_inactive, warning=FALSE, message=FALSE, results='hide'}
inactive = boxes %>%
  # remove boxes that were updated in the last two days,
  # b/c any box becomes inactive at some point by definition of updatedAt
  filter(lastMeasurement < now() - days(2)) %>%
  mutate(week = cut(as.Date(lastMeasurement), breaks = bins)) %>%
  filter(as.Date(week) > as.Date("2021-12-31")) %>%
  group_by(week) %>%
  summarize(count = length(week)) %>%
  mutate(event = 'inactive')
```

Now we can combine both datasets for plotting:
```{r growthrate, warning=FALSE, message=FALSE, results='hide'}
boxes_by_date = bind_rows(growth, inactive) %>% group_by(event)

ggplot(boxes_by_date, aes(x = as.Date(week), colour = event)) +
  xlab('Time') + ylab(paste('rate per ', bins)) +
  scale_x_date(date_breaks="years", date_labels="%Y") +
  scale_colour_manual(values = c(registered = 'lightgreen', inactive = 'grey')) +
  geom_point(aes(y = count), size = 0.5) +
  # moving average, make first and last values NA (to ensure identical vector lengths)
  geom_line(aes(y = rollmean(count, mvavg_bins, fill = list(NA, NULL, NA))))
```

And see in which weeks the most boxes become (in)active:
```{r table_mostregistrations}
boxes_by_date %>%
  filter(count > 50) %>%
  arrange(desc(count))
```

# Plot duration of boxes being active {.tabset}
While we are looking at `locationtimestamp` and `lastMeasurement`, we can also
extract the duration of activity of each box, and look at metrics by exposure
and grouptag once more:

## ...by exposure
```{r exposure_duration, message=FALSE}
durations = boxes %>%
  group_by(exposure) %>%
  filter(!is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(durations, aes(x = exposure, y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days')
```

The time of activity averages at only `r round(mean(durations$duration))` days,
though there are boxes with `r round(max(durations$duration))` days of activity,
spanning a large chunk of openSenseMap's existence.

## ...by grouptag
```{r grouptag_duration, message=FALSE}
durations = boxes %>%
  filter(!is.na(lastMeasurement)) %>%
  group_by(grouptag) %>%
  # only include grouptags with 15 or more members
  filter(length(grouptag) >= 15 & !is.na(grouptag) & !is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(durations, aes(x = grouptag, y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days')

durations %>%
  summarize(
    duration_avg = round(mean(duration)),
    duration_min = round(min(duration)),
    duration_max = round(max(duration)),
    oldest_box = round(max(difftime(now(), locationtimestamp, units='days')))
  ) %>%
  arrange(desc(duration_avg))
```

The time of activity averages at only `r round(mean(durations$duration))` days,
though there are boxes with `r round(max(durations$duration))` days of activity,
spanning a large chunk of openSenseMap's existence.

## ...by year of registration
This is less useful, as older boxes are active for a longer time by definition.
If you have an idea how to compensate for that, please send a [Pull Request][PR]!
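
One possible compensation, sketched here under the assumption that the `durations` data.frame from the previous section is still available: normalise each box's active duration by the maximum duration it could have reached since registration, giving an activity ratio in [0, 1] that removes the head start of older boxes.

```{r duration_ratio, eval=FALSE}
# Sketch: activity ratio instead of absolute days.
durations %>%
  mutate(ratio = as.numeric(duration) /
                 as.numeric(difftime(now(), locationtimestamp, units = 'days')))
```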

```{r year_duration, message=FALSE}
# NOTE: boxes older than 2016 missing due to missing updatedAt in database
duration = boxes %>%
  mutate(year = cut(as.Date(locationtimestamp), breaks = 'year')) %>%
  group_by(year) %>%
  filter(!is.na(lastMeasurement)) %>%
  mutate(duration = difftime(lastMeasurement, locationtimestamp, units='days')) %>%
  filter(duration >= 0)

ggplot(duration, aes(x = substr(as.character(year), 0, 4), y = duration)) +
  geom_boxplot() +
  coord_flip() + ylab('Duration active in Days') + xlab('Year of Registration')
```

# More Visualisations
Other visualisations come to mind, and are left as an exercise to the reader.
If you implemented some, feel free to add them to this vignette via a [Pull Request][PR].

* growth by phenomenon
* growth by location -> (interactive) map
* set inactive rate in relation to total box count
* filter timespans with big dips in growth rate, and extrapolate the amount of
  senseBoxes that could be on the platform today, assuming there were no production issues ;)

[PR]: https://github.com/sensebox/opensensmapr/pulls
Binary file not shown.
Binary file not shown.