Human Generated Data

Title

Untitled (woman in the back of a pick-up truck eating a sandwich)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8801

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.3
Human 98.3
Clothing 92.4
Apparel 92.4
Transportation 74.9
Furniture 74.5
Vehicle 74.1
Car 72.6
Automobile 72.6
Advertisement 69.7
Poster 67
Photo 64
Face 64
Portrait 64
Photography 64
Collage 60.9
Shoe 59.8
Footwear 59.8
Couch 59
Female 56.1
Shorts 55.5

Imagga
created on 2022-01-09

chair 27.4
people 26.2
seat 24.8
device 24.1
computer 21.9
adult 21.4
office 20.2
person 20
laptop 19.7
support 19.3
business 18.8
happy 18.8
work 18
smile 17.1
working 16.8
home 16.7
printer 16.3
sitting 16.3
equipment 16.3
smiling 15.9
portrait 15.5
lady 15.4
machine 15.2
house 15
interior 15
negative 15
attractive 14.7
rest 14.5
sexy 14.5
armrest 14.4
women 14.2
keyboard 14.1
indoors 14.1
furniture 13.4
technology 13.4
room 13.3
pretty 13.3
fashion 12.8
hair 12.7
worker 12.4
model 12.4
job 12.4
armchair 12.3
film 12.1
one 11.9
sofa 11.7
blond 11.6
notebook 10.9
communication 10.9
modern 10.5
monitor 10.4
black 10.2
lifestyle 10.1
man 10.1
sensual 10
desk 9.8
luxury 9.4
rocking chair 9.2
leisure 9.1
indoor 9.1
businesswoman 9.1
piano 9
cheerful 8.9
table 8.7
brunette 8.7
concentration 8.7
photographic paper 8.6
face 8.5
back 8.3
home appliance 8.1
looking 8
look 7.9
car 7.7
youth 7.7
casual 7.6
elegance 7.6
human 7.5
peripheral 7.4
white goods 7.3
clothing 7.3
photographic equipment 7.3
design 7.3
success 7.2
body 7.2
happiness 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 95.9
outdoor 89.7
vehicle 83.8
car 82.1
land vehicle 64
black and white 58.6
clothing 50.8

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 61.6%
Calm 49.3%
Happy 44.3%
Sad 2.1%
Surprised 2%
Confused 0.8%
Disgusted 0.6%
Angry 0.5%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%

Text analysis

Amazon

MJI7--YT37S--

Google

MJI7--YT A2--AGO
MJI7--YT
A2--AGO