Human Generated Data

Title

Untitled (Campbell Soup advertisement: couple getting into chauffeured car)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8848

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Plant 99.2
Human 98.7
Person 98.7
Tree 87.9
Apparel 85.2
Clothing 85.2
Person 84.9
Building 76.3
Architecture 76.3
Person 74.9
Coat 74.3
Suit 74.3
Overcoat 74.3
Arecaceae 71.4
Palm Tree 71.4
Transportation 71.2
Vehicle 70.9
Food 70.4
Pineapple 70.4
Fruit 70.4
Car 66.5
Automobile 66.5
Female 59.7
Door 58.5
Gown 56.8
Fashion 56.8
Bridegroom 55.2
Wedding 55.2
Pedestrian 55

Imagga
created on 2022-01-15

turnstile 100
gate 93.9
movable barrier 71.1
barrier 51.8
architecture 51.1
building 48.1
city 32.4
obstruction 27.3
window 25.6
structure 22.7
old 21.6
urban 21
house 20.1
tourism 18.2
town 16.7
door 16.3
travel 16.2
street 15.6
historic 15.6
construction 15.4
column 14.4
stone 14.3
balcony 14
interior 13.3
ancient 13
home 12.8
history 12.5
windows 12.5
facade 12.5
wall 12
modern 11.9
sky 11.5
famous 11.2
landmark 10.8
steel 10.6
buildings 10.4
historical 10.3
glass 10.1
tourist 10
religion 9.9
tower 9.8
arch 9.5
perspective 9.4
light 9.4
destination 9.4
monument 9.3
roof 8.7
entrance 8.7
palace 8.7
residential 8.6
culture 8.5
business 8.5
plaza 8.5
design 8.4
inside 8.3
office 8.2
metal 8
ceiling 7.9
museum 7.8
built 7.7
industry 7.7
brick 7.5
exterior 7.4
vacation 7.4
indoor 7.3
new 7.3
art 7.2

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 96.3
outdoor 94.6
black and white 84.6
wedding 75.6
white 72
tree 55.1
window 18.2

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 91.8%
Happy 84.6%
Calm 10.6%
Sad 3%
Surprised 0.5%
Confused 0.4%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a sign on the side of a building 81.4%
a store inside of a building 81.3%
a person standing in front of a store window 60.7%

Text analysis

Amazon

39502
STA
YТ37°-X

Google

STA YT37A°2- AGO 39502
STA
AGO
39502
YT37A°2-