
Posit AI Blog: TensorFlow and Keras 2.9



The release of Deep Learning with R, 2nd Edition
coincides with new releases of TensorFlow and Keras. These releases bring
many refinements that allow for more idiomatic and concise R code.

First, the set of Tensor methods for base R generics has greatly
expanded. The set of R generics that work with TensorFlow Tensors is now
quite extensive:

methods(class = "tensorflow.tensor")
 [1] -           !           !=          [           [<-        
 [6] *           /           &           %/%         %%         
[11] ^           +           <           <=          ==         
[16] >           >=          |           abs         acos       
[21] all         any         aperm       Arg         asin       
[26] atan        cbind       ceiling     Conj        cos        
[31] cospi       digamma     dim         exp         expm1      
[36] floor       Im          is.finite   is.infinite is.nan     
[41] length      lgamma      log         log10       log1p      
[46] log2        max         mean        min         Mod        
[51] print       prod        range       rbind       Re         
[56] rep         round       sign        sin         sinpi      
[61] sort        sqrt        str         sum         t          
[66] tan         tanpi      

This means you can frequently write the same code for TensorFlow Tensors
as you would for R arrays. For example, consider this small function
from Chapter 11 of the book:

reweight_distribution <-
  function(original_distribution, temperature = 0.5) {
    original_distribution %>%
      { exp(log(.) / temperature) } %>%
      { . / sum(.) }
  }

Note that functions like reweight_distribution() work with both 1D R
vectors and 1D TensorFlow Tensors, since exp(), log(), /, and
sum() are all R generics with methods for TensorFlow Tensors.
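
As a small illustration (a hedged sketch rather than code from the original post; the example values are made up), the same function can be called with either a plain R vector or a Tensor created with as_tensor():

library(tensorflow)
library(keras)  # re-exports the %>% pipe used inside reweight_distribution()

probs <- c(0.1, 0.2, 0.3, 0.4)

# Works on a plain 1D R vector ...
reweight_distribution(probs)

# ... and on a 1D TensorFlow Tensor through the very same code path,
# because exp(), log(), `/`, and sum() dispatch to Tensor methods.
reweight_distribution(as_tensor(probs))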

In the same vein, this Keras release brings with it a refinement to the
way custom class extensions to Keras are defined. Partially inspired by
the new R7 syntax, there is a
new family of functions: new_layer_class(), new_model_class(),
new_metric_class(), and so on. This new interface substantially
simplifies the amount of boilerplate code required to define custom
Keras extensions, providing a pleasant R interface that serves as a facade over
the mechanics of subclassing Python classes. This new interface is the
yang to the yin of %py_class%, a way to mimic the Python class
definition syntax in R. Of course, the "raw" API of converting an
R6Class() to Python via r_to_py() is still available for users that
require full control.

This release also brings with it a cornucopia of small improvements
throughout the Keras R interface: updated print() and plot() methods
for models, improvements to freeze_weights() and load_model_tf(),
and new exported utilities like zip_lists() and %<>%. And let's not
forget to mention a new family of R functions for modifying the learning
rate during training, with a suite of built-in schedules like
learning_rate_schedule_cosine_decay(), complemented by an interface
for creating custom schedules with new_learning_rate_schedule_class().
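
For instance, a built-in schedule can be passed where a fixed learning rate would normally go (a hedged sketch; the toy model and the hyperparameter values are arbitrary illustrations):

library(keras)

# A cosine-decay schedule used in place of a constant learning rate.
# The numbers here are arbitrary, chosen only for illustration.
schedule <- learning_rate_schedule_cosine_decay(
  initial_learning_rate = 1e-3,
  decay_steps = 10000
)

model <- keras_model_sequential() %>%
  layer_dense(units = 1, input_shape = 4)

model %>% compile(
  optimizer = optimizer_adam(learning_rate = schedule),
  loss = "mse"
)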

You can find the full release notes for the R packages here:

The release notes for the R packages tell only half the story, however.
The R interfaces to Keras and TensorFlow work by embedding a full Python
process in R (via the
reticulate package). One of
the biggest benefits of this design is that R users have full access to
everything in both R and Python. In other words, the R interface
always has feature parity with the Python interface; anything you can
do with TensorFlow in Python, you can do in R just as easily. This means
the release notes for the Python releases of TensorFlow are just as
relevant for R users:
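
As a small, hedged illustration of that parity (not an example from the original post), any symbol in the Python tf module can be reached directly from R through the embedded module proxy:

library(tensorflow)

# Call a TensorFlow Python API directly, even one without a dedicated
# R wrapper; reticulate converts the R string and returns a Tensor.
tf$strings$upper("hello, tensorflow")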

Thanks for reading!

Photo by Raphael Wild on Unsplash

Reuse

Text and figures are licensed under Creative Commons Attribution CC BY 4.0. The figures that have been reused from other sources do not fall under this license and can be recognized by a note in their caption: "Figure from …".

Citation

For attribution, please cite this work as

Kalinowski (2022, June 9). Posit AI Blog: TensorFlow and Keras 2.9. Retrieved from https://blogs.rstudio.com/tensorflow/posts/2022-06-09-tf-2-9/

BibTeX citation

@misc{kalinowskitf29,
  author = {Kalinowski, Tomasz},
  title = {Posit AI Blog: TensorFlow and Keras 2.9},
  url = {https://blogs.rstudio.com/tensorflow/posts/2022-06-09-tf-2-9/},
  year = {2022}
}

