Human Generated Data

Title

Untitled (man seated with bluebird)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10707

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Furniture 100
Chair 100
Human 98.2
Person 98.2
Table 87.7
Dining Table 84.6
Clothing 74.3
Apparel 74.3
Female 74.1
Plant 73.9
Face 73
Grass 69
Outdoors 68.6
Portrait 65.2
Photography 65.2
Photo 65.2
Girl 64.3
Sitting 58.3

Imagga
created on 2022-01-15

snow 30.9
newspaper 28.2
winter 27.2
cold 25
people 21.7
person 21.3
product 20.5
outdoors 19.6
man 18.8
creation 16.8
lifestyle 15.2
snowy 14.6
adult 14.6
outdoor 14.5
bench 14.4
park 14.3
landscape 14.1
happy 13.8
frost 13.4
frozen 13.4
ice 13.3
tree 13.1
outside 12.8
male 12.8
weather 12.6
season 12.5
scene 12.1
sitting 12
smile 11.4
work 11.4
smiling 10.8
holding 10.7
cool 10.7
forest 10.4
sky 10.2
happiness 10.2
casual 10.2
negative 10.1
vacation 9.8
beach 9.8
freeze 9.7
summer 9.6
laptop 9.6
seat 9.4
youth 9.4
water 9.3
building 9.2
travel 9.2
city 9.1
child 9.1
professional 9.1
hand 9.1
portrait 9.1
trees 8.9
job 8.8
sand 8.8
women 8.7
day 8.6
men 8.6
business 8.5
senior 8.4
black 8.4
old 8.4
scholar 8.2
fun 8.2
relaxing 8.2
transportation 8.1
snowing 7.9
couple 7.8
frosty 7.8
pretty 7.7
reading 7.6
handcart 7.5
fashion 7.5
computer 7.3
film 7.2
chair 7.2
holiday 7.2
park bench 7.2
mountain 7.1
working 7.1
seasonal 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 96.4
outdoor 90.2
person 70.6
black and white 63.5
clothing 63

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Male, 97.5%
Calm 39.5%
Sad 25%
Surprised 16.3%
Confused 12.8%
Angry 2.7%
Disgusted 1.4%
Fear 1.2%
Happy 1.1%

Feature analysis

Amazon

Person 98.2%

Captions

Microsoft

a person sitting on a bench reading a book 33.9%
a person sitting on a bench reading a book 33.8%

Text analysis

Amazon

ae
KODYK-20LEIA

Google

YT37A2- AG
YT37A2-
AG