Human Generated Data

Title

Untitled (two men cooking in a log cabin over a portable stove)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5262

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.7
Person 99.7
Person 99.5
Clothing 88.3
Apparel 88.3
Wood 83.8
Furniture 83.1
Chair 72.8
Meal 71.4
Food 71.4
Indoors 67.6
Room 67.6
Plywood 65.5
Home Decor 64.6
Dish 60.0
Face 58.0
Table 56.7
Shelf 56.5
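Label lists of this shape (name plus confidence percentage) are what Amazon Rekognition's DetectLabels API returns. A minimal sketch of producing and formatting such a list; the boto3 call is shown but commented out, and `sample_response` is a hand-made stand-in mimicking the API's return shape, not real output for this image:

```python
# Sketch: formatting Rekognition-style labels into "Name Confidence"
# lines like those above. No live API call is made here.

# import boto3
# client = boto3.client("rekognition")
# with open("photo.jpg", "rb") as f:  # hypothetical filename
#     response = client.detect_labels(
#         Image={"Bytes": f.read()}, MinConfidence=55
#     )

# Stand-in for the API response (values copied from the list above).
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.7},
        {"Name": "Wood", "Confidence": 83.8},
        {"Name": "Chair", "Confidence": 72.8},
    ]
}

def format_labels(response):
    """Render each label as 'Name Confidence', rounded to one decimal."""
    return [
        f"{lbl['Name']} {round(lbl['Confidence'], 1)}"
        for lbl in response["Labels"]
    ]

for line in format_labels(sample_response):
    print(line)
```

The real response also carries bounding boxes and parent categories per label; only name and confidence are kept here, matching what the record displays.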

Imagga
created on 2022-01-22

musical instrument 52.9
accordion 52
keyboard instrument 41.8
wind instrument 31.9
shop 29.7
old 23.7
mercantile establishment 22.4
religion 19.7
ancient 19
temple 18
statue 17.3
architecture 17.2
travel 16.2
barbershop 15.9
sculpture 15.5
bakery 15.2
place of business 14.9
religious 14
building 13.5
stone 13.5
tourism 13.2
art 12.4
traditional 11.6
gold 11.5
golden 11.2
culture 11.1
palace 10.6
china 10.3
device 10.2
historic 10.1
history 9.8
antique 9.8
seller 9.4
interior 8.8
man 8.7
house 8.3
equipment 8.3
city 8.3
tourist 8.3
work 8.2
landmark 8.1
metal 8
chair 7.7
wall 7.7
spiritual 7.7
god 7.6
structure 7.6
oriental 7.5
historical 7.5
monument 7.5
east 7.5
vintage 7.4
famous 7.4
tradition 7.4
retro 7.4
establishment 7.3
decoration 7.2

Google
created on 2022-01-22

Shirt 94
Window 93.4
Black 89.6
Black-and-white 86
Style 84
Monochrome 78.7
Monochrome photography 78.2
Building 75.7
T-shirt 74.9
Snapshot 74.3
House 70.2
Cooking 67.8
Room 67.7
Stock photography 66.2
Machine 64.7
Curtain 62.5
Pattern 52.8

Microsoft
created on 2022-01-22

text 96.5
person 95.9
clothing 95.2
man 86.3
black and white 83.1

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 78.7%
Calm 81.8%
Sad 13.2%
Confused 1.9%
Surprised 0.8%
Happy 0.7%
Fear 0.6%
Disgusted 0.5%
Angry 0.5%
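The age range, gender, and ranked emotion percentages above follow the shape of Rekognition's DetectFaces output (one `FaceDetails` entry per face). A hedged sketch of ranking the emotions; `face_detail` is a hand-made stand-in using values from the record, not a live API result:

```python
# Sketch: sorting Rekognition DetectFaces emotion scores into the
# ranked list shown above. `face_detail` mimics one entry of the
# API's FaceDetails array; no live call is made.

face_detail = {
    "AgeRange": {"Low": 30, "High": 40},
    "Gender": {"Value": "Female", "Confidence": 78.7},
    "Emotions": [
        {"Type": "SAD", "Confidence": 13.2},
        {"Type": "CALM", "Confidence": 81.8},
        {"Type": "CONFUSED", "Confidence": 1.9},
    ],
}

def ranked_emotions(detail):
    """Return (emotion, confidence) pairs, highest confidence first."""
    return sorted(
        ((e["Type"], e["Confidence"]) for e in detail["Emotions"]),
        key=lambda pair: pair[1],
        reverse=True,
    )

print(ranked_emotions(face_detail)[0])  # dominant emotion
```

Note that emotion confidences are per-emotion scores, not a probability distribution, so they need not sum to 100.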

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Chair 72.8%

Captions

Microsoft

a group of people standing in front of a building 71%
a man and a woman standing in front of a building 49.4%
a group of people in front of a building 49.3%

Text analysis

Amazon

I