Human Generated Data

Title

Untitled (seated couple eating Campbell's Soup outdoors)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5288

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.7
Human 98.7
Furniture 97.3
Chair 97.3
Person 97.2
Clothing 80.3
Apparel 80.3
Transportation 80
Vehicle 80
Face 76.7
Outdoors 74.8
Automobile 73.6
Sports Car 73.6
Car 73.6
People 69.9
Portrait 68.3
Photography 68.3
Photo 68.3
Table 67.3
Nature 65.3
Shorts 62.3
Plant 59.7
Road 59.5
Sedan 57.2
Female 56.3
Coupe 55.5
Sitting 55.1
Person 51.2
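
The Amazon tags above pair a label name with a confidence score, the shape of output returned by the AWS Rekognition DetectLabels API. A minimal sketch of producing such a list with boto3; the region and local file name are assumptions for illustration:

    import boto3

    # Rekognition client; the region is an assumption for illustration
    client = boto3.client("rekognition", region_name="us-east-1")

    # Load the photograph as raw bytes (hypothetical local file name)
    with open("steinmetz_4.2002.5288.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100),
    # matching the "Person 98.7" style of the entries above
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')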

Imagga
created on 2022-01-22

boat 40.4
sea 30.5
ship 28.8
vessel 26.8
water 26
ocean 24
harbor 20.2
vehicle 19
port 16.4
travel 16.2
transportation 16.1
dock 15.6
transport 15.5
sky 14
old 13.9
beach 12.6
sail 12.6
coast 12.6
craft 12.1
machine 12
brass 12
industry 11.9
sailing 11.7
boats 11.6
tourism 11.5
equipment 11.5
device 11.3
summer 10.9
building 10.6
gondola 10.5
construction 10.3
fishing 9.6
cloud 9.5
work 9.4
winter 9.4
shore 9.3
house 9.2
black 9
marina 9
vacation 9
nautical 8.7
bay 8.7
wind instrument 8.6
cold 8.6
snow 8.6
business 8.5
industrial 8.2
musical instrument 8.1
sun 8
history 8
mast 7.8
ships 7.8
pier 7.8
wave 7.8
power 7.5
traditional 7.5
landscape 7.4
waves 7.4
sport 7.4
light 7.3
island 7.3
tourist 7.2
metal 7.2
scenery 7.2
sunset 7.2
holiday 7.2
river 7.1
worker 7.1
day 7.1
wooden 7
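
The Imagga tags follow the same name-plus-confidence pattern and match the response shape of Imagga's v2 tagging endpoint. A sketch using the REST API via the requests library; the credentials and image URL are placeholders:

    import requests

    # Placeholder credentials and image URL
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/steinmetz.jpg"

    # GET /v2/tags scores each tag on a 0-100 confidence scale
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')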

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

outdoor 97.4
text 96.5
man 91.8
clothing 88.4
person 88.3
black and white 66.1
ship 65.8
tent 62.3
posing 48
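
The Microsoft tags resemble the output of the Azure Computer Vision image-analysis REST API. A sketch assuming a v3.2 endpoint; the resource name, key, and image URL are placeholders:

    import requests

    # Placeholder endpoint, key, and image URL
    ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"
    KEY = "your_subscription_key"
    IMAGE_URL = "https://example.org/steinmetz.jpg"

    # The Tags feature returns name/confidence pairs on a 0-1 scale,
    # shown above as percentages
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')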

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Female, 66.3%
Happy 51%
Calm 28.9%
Sad 13.5%
Angry 3.2%
Surprised 1.1%
Fear 0.9%
Disgusted 0.9%
Confused 0.5%
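
The age range, gender estimate, and ranked emotions above match the shape of the AWS Rekognition DetectFaces response. A sketch with boto3, reusing the same hypothetical region and file name as earlier:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_4.2002.5288.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotions
    # to the default bounding-box output
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions sorted by confidence, highest first
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')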

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a man holding a sign posing for the camera 56.5%
a man sitting on a bed 51.1%
a man sitting on a bench 51%
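
The ranked captions with confidence scores look like the output of the same Azure Computer Vision service's Describe operation. A sketch, with the same placeholder endpoint and key as before:

    import requests

    ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"
    KEY = "your_subscription_key"
    IMAGE_URL = "https://example.org/steinmetz.jpg"

    # Describe returns up to maxCandidates captions, each with a
    # 0-1 confidence reported above as a percentage
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": "3"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    for caption in response.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')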

Text analysis

Amazon

arec
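
A short fragment like "arec" is typical of running text detection on a photograph with incidental text. A sketch of the AWS Rekognition DetectText call that could produce it, with the same hypothetical inputs as above:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_4.2002.5288.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectText returns LINE and WORD detections with confidences;
    # printing only the lines mirrors the single string shown above
    response = client.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])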