Systems Admin/Programmer III

 
  • University of Florida
  • United States
  • May 28, 2021
  • Technology
  • Full Time - Continuing

Classification Title:

Systems Admin/Programmer III

Job Description:

Assist in maintaining existing computer, networking, and software infrastructure in the ACIS laboratory. Integrate infrastructure developed as part of the iDigBio project into the overall resource offerings of the ACIS laboratory. Act as a liaison between the iDigBio project's and the lab's hardware and software infrastructure and other computing services and centers at UF and elsewhere. Manage and coordinate work done by undergraduate and graduate students that affects the functioning of the iDigBio project's and the lab's computing infrastructure. Document best practices and develop technical support materials for ACIS hardware and software.

Other significant activities include serving as a systems programmer and technology implementer for collaborative projects with scientists from application domains. The incumbent will help the team design, implement, and maintain storage, infrastructure, platform, and software clouds, including software and hardware selection. Integrate external cloud and distributed data resources with resources developed as part of the projects. Maintain expertise in cloud software and hardware.

Develop software for cloud middleware and/or web portals using modern front-end web technologies and frameworks. Collect and report performance and quality metrics to ensure resources are meeting project goals. Create documentation and software packages to make work usable by other institutions. Train collaborators and end users in the cloud and software resources created.

Design, implement, and support complex ETL mappings to migrate large data volumes from heterogeneous source systems into a central data store. Participate in the design of new or changing data mappings and workflows, evolving the iDigBio data model as data standards are updated and data growth requires. Produce technical specifications and documentation to communicate effectively with data providers and consumers. Develop and maintain data visualization workflows and tools to communicate with peers, leadership, and research partners.

Expected Salary:

$75,000 - $92,000, commensurate with qualifications and experience

Minimum Requirements:

Bachelor's degree in an appropriate area and two years of relevant experience; or a high school diploma or equivalent and six years of experience. Appropriate college coursework or vocational/technical training may substitute at an equivalent rate for the required experience.

Preferred Qualifications:

Ability to communicate with a variety of audiences (technical, management, community)

Ability to be self-motivated and to identify and own the work that needs to be done

Ability to prioritize high-impact work over non-critical tasks

Commitment to write and document clean code and unit tests

Proficiency in Node.js or Python (or at least one programming language from Java, Go, Rust, Ruby, or C), SQL, and Elasticsearch Query DSL

Experience with the technologies used in the current iDigBio architecture or their close equivalents - Node.js (React, Express, Leaflet), Python, PostgreSQL, Elasticsearch, Ceph (S3-compatible API), Redis

Experience consuming and designing RESTful APIs

Experience developing software in a team environment with git and multiple committers

Experience with data interchange formats such as JSON, XML, and CSV

Experience administering one or more virtualization platforms such as Citrix XenServer, VMware, Hyper-V, Xen, KVM, etc.

Experience managing web-tier components such as NGINX, Apache HTTP Server, HAProxy, and Varnish

Experience with monitoring and alerting of production systems

Experience using configuration management and IT automation software such as Salt, Ansible, Puppet, Terraform, Uyuni, or similar

Experience developing containerized applications in Docker or Kubernetes

Experience designing front-end user interfaces, including the use of HTML, CSS, and JavaScript to interact with backend web APIs

Experience with Linux or other Unix-like operating systems, especially using the command line, Bash, and shell scripting

Experience preparing and presenting training materials

Familiarity with Event Driven Architecture, messaging, queues, microservices, and distributed systems

Familiarity with ontologies, semantic web, and RDF

Familiarity with NoSQL or document storage systems such as Hadoop/HBase, Cassandra, MongoDB, CouchDB, or RethinkDB

Familiarity with blob storage systems such as S3

Familiarity with Continuous Integration / Continuous Delivery (CI/CD) and GitOps practices

Willingness to learn and work in all parts of the stack, including the network, hardware infrastructure, database, API, and front-end, and all layers in-between

Interest in writing documentation

Interest in Natural History Collections

Special Instructions to Applicants:

In order to be considered for this position, applicants must upload a cover letter and resume with their application.

This is a time-limited position.

Application must be submitted by 11:55 p.m. (ET) of the posting end date.

This position has been reposted. Previous applicants are still under consideration and need not reapply.

Health Assessment Required: No