Known as work-from-anywhere (WFA), the remote work phenomenon has gained serious momentum in recent decades with the advent of cloud-based productivity tools and ever-increasing Internet connectivity and bandwidth.

Ever since many workers were forced to operate from home (or remotely) under the pandemic’s social distancing restrictions, the debate over which work style, remote or in-person, is superior has raged on. Some companies (Apple, Google, Zoom) have continued to allow remote work, arguing that it is just as productive as in-office work. Data from Cloudbrink (Bosch, 2025) shows that WFA employees work longer hours, averaging close to 12 per day. Despite that, some leading tech employers, such as Amazon, have recently announced that employees must return to the office five days a week. Tesla’s CEO, at one point, notably urged employees to return to the office at least 40 hours per week or depart the company.

Remote Teaching and Learning

So why do company philosophies seem so divided on the issue of in-person vs. remote work? We will return to that in a minute. First, teaching and learning offer another example of how fiercely the remote vs. in-person question is contested. Due to the pandemic and distancing mandates, many instructors were “forced” to teach online, often for the first time. Virtual (Zoom-based) class sessions became the norm, characterized by students multitasking with cameras turned off, limiting instructors’ ability to gauge authentic engagement and participation. Additionally, asynchronous courses (those with no regularly scheduled live sessions) require instructors to be highly skilled in instructional design and arguably to function more as facilitators than as traditional teachers or lecturers. But just as we are not abandoning modern transportation modes like cars and planes to return to the horse and buggy, an education model that assumes students need every class held in person is unrealistic, possibly even detrimental. When commerce evolves in new ways, shouldn’t colleges?

Just as modern technology has revolutionized the world of work, it has the potential to do the same for higher education. An important question is why we must take an all-or-nothing approach to either one. Hybrid work, which combines in-office and remote work, seems to offer a more balanced way forward. Many companies already operate this way, understanding that the nature of a job greatly influences where the work should take place. Some work can be accomplished from home (document preparation, office communication, data analysis, etc.), while other types of work (handling hazardous chemicals or materials, manufacturing, etc.) cannot.

Applying the same thinking to education, why do institutions struggle to find a balance between in-person and online learning? This is not to say that “hybrid” courses as we currently operate them are the answer, but a creative balance that places unique tasks (labs, clinical work, co-ops, internships, etc.) in the environments best suited to them must be considered. Some institutions now refer to these methods as “high-impact practices.” It is time to start thinking strategically about adapting high-impact practices from the workplace to the educational environment. Apple’s original slogan urged the world to “Think different.” Maybe a new approach to education should be “learn differently, work differently.”

Reflection Questions

  1. How would you define “high-impact practices,” and what examples have you observed at your institution or workplace?
  2. How can we think differently regarding the current approach to in-person vs. remote work and school?
  3. How might the future of work and education evolve as technology and AI significantly impact society?