Human Generated Data

Title

Untitled (women behind counter in front of "Heinz 57" display wall)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4415

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 98.5
Person 98
Person 97.3
Indoors 95.1
Interior Design 95.1
Person 94.9
Person 94.4
Person 94.1
Person 92.7
Person 91.8
Person 91.2
Person 90.2
Crowd 80.5
Person 79.8
Person 76.7
Audience 74.6
Text 69.9
Person 69.8
People 68.5
Face 67.7
Clothing 60.9
Apparel 60.9
Person 57.7
Person 57.6
Person 56.6
Poster 55.7
Advertisement 55.7
Shop 55.3
Word 55.2

Clarifai
created on 2023-10-26

people 99.7
group 97.7
many 97.3
adult 96.9
crowd 96.7
man 95.9
monochrome 93.5
administration 92.8
group together 92.2
leader 90.4
woman 90.1
war 88.9
chair 84.9
street 79.2
theater 77.7
building 75.5
music 73.9
horizontal plane 70.8
audience 70.5
transportation system 68.9

Imagga
created on 2022-01-23

city 33.3
building 31.8
hall 30.5
center 27.7
architecture 27.7
university 25.8
office 23.2
urban 19.2
skyline 17.1
business 17
sky 16.6
skyscraper 16.4
travel 16.2
structure 16.2
cityscape 16.1
buildings 16.1
night 16
classroom 14.8
modern 14.7
newspaper 14.5
financial 13.4
skyscrapers 12.7
cinema 12.7
landmark 12.6
room 12
downtown 11.5
theater 11.3
people 11.2
product 11.1
light 10.7
design 10.7
new 10.5
bank 10.5
famous 10.2
money 10.2
finance 10.1
tower 9.8
tourist 9.8
old 9.8
windows 9.6
construction 9.4
town 9.3
street 9.2
banking 9.2
historic 9.2
stage 8.8
graphic 8.8
bridge 8.7
creation 8.6
vintage 8.3
paper 7.8
us 7.7
grunge 7.7
memorial 7.6
communication 7.6
tall 7.5
monument 7.5
rich 7.4
daily 7.4
cash 7.3
art 7.2
web site 7

Google
created on 2022-01-23

Black 89.7
Building 89.3
Black-and-white 87.6
Coat 85.5
Style 84.1
Line 82.7
Font 80.2
Monochrome 78.8
Monochrome photography 78.5
Crowd 77.5
Snapshot 74.3
Event 73.4
Suit 72.5
Facade 70
City 68.7
Room 68.2
History 67.5
Art 66.2
Stock photography 65.7
Commercial building 57.2

Microsoft
created on 2022-01-23

text 100
person 86.6
newspaper 77.6
clothing 72.5
man 70.7

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 98.4%
Calm 99.7%
Sad 0.2%
Surprised 0%
Fear 0%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 36-44
Gender Male, 76.8%
Happy 82.9%
Sad 7.8%
Confused 4.4%
Calm 1.9%
Fear 1.3%
Surprised 0.7%
Disgusted 0.6%
Angry 0.5%

AWS Rekognition

Age 48-56
Gender Female, 78.8%
Calm 99.9%
Sad 0.1%
Surprised 0%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 99.2%
Happy 50.6%
Sad 28%
Calm 14.6%
Confused 2.7%
Surprised 2%
Disgusted 1.1%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 39-47
Gender Female, 80.2%
Sad 50.1%
Calm 41.5%
Happy 4.9%
Confused 2.5%
Angry 0.3%
Disgusted 0.2%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 37-45
Gender Male, 81%
Calm 98.9%
Sad 0.4%
Confused 0.4%
Happy 0.2%
Angry 0%
Surprised 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Poster 55.7%

Text analysis

Amazon

TO AN
TO AN INDIVIDUAL
INDIVIDUAL
SOUPS
-
HEINZ
HEINZ -
17269.
KODV
well
أ
EVEETA
by
1267.
-Send
DC
-Send Vintage
Vintage
M

Google

TO AN INDIVIDUA SOUPS ENZ 11267.
TO
AN
INDIVIDUA
SOUPS
ENZ
11267.