Human Generated Data

Title

Untitled (women behind counter in front of "Heinz 57" display wall)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4414

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 99.1
Person 99.1
Person 98.7
Person 98.5
Person 98.3
Person 98.3
Person 97.3
Person 96.3
Person 96.2
Interior Design 95.2
Indoors 95.2
Person 91.1
Person 87.1
Restaurant 82.1
Crowd 79.1
Room 78.1
Cafeteria 71.1
Person 70.8
People 69.9
Person 68.7
Person 68.5
Person 66.0
Court 61.6
Audience 58.9
Urban 58.6
Building 57.4
Cafe 55.7
Classroom 55.5
School 55.5
Text 55.3

Clarifai
created on 2023-10-26

people 99.9
many 99.3
group 99.3
adult 98.2
administration 97.5
man 96.7
woman 95.5
crowd 95
group together 94.2
monochrome 90.7
leader 90.5
chair 90.2
sit 81.6
vehicle 80
several 79.7
wear 79.3
war 78.6
music 78.6
audience 74.8
transportation system 74.7

Imagga
created on 2022-01-23

hall 55.8
city 40.7
architecture 40
building 35.7
university 30.6
center 29.4
travel 23.9
classroom 21.9
night 21.3
urban 20.1
buildings 18.9
cityscape 18
room 17.5
tourist 17.4
sky 17.2
skyline 17.1
landmark 16.2
famous 15.8
office 15.7
skyscraper 14.9
modern 14.7
downtown 14.4
old 13.9
town 13.9
people 13.4
tourism 13.2
business 12.7
skyscrapers 12.7
structure 12.5
construction 12
financial 11.6
new 11.3
scene 11.2
monument 11.2
tower 10.7
capital 9.5
light 9.4
lights 9.3
street 9.2
historic 9.2
square 9
newspaper 8.8
windows 8.6
money 8.5
tall 8.5
dark 8.3
bank 8.1
transportation 8.1
station 7.8
district 7.7
culture 7.7
panorama 7.6
finance 7.6
bridge 7.6
evening 7.5
row 7.4
landscape 7.4
church 7.4
window 7.3
design 7.3
holiday 7.2
house 7.2
history 7.2

Google
created on 2022-01-23

Photograph 94.1
Black 89.7
Building 83
Line 81.6
Font 80.8
Monochrome 77.2
Monochrome photography 74.7
Event 72.3
Chair 66.1
History 65.9
Stock photography 64.9
Suit 64.2
Art 64.1
Crowd 61.5
Room 60.6
Mixed-use 58
Paper product 57
Facade 56.9
Rectangle 56.8
Pattern 54.4

Microsoft
created on 2022-01-23

text 99.9
person 98
clothing 87.1
man 82.6
people 73.4

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 64.7%
Calm 86.2%
Sad 6.1%
Fear 1.9%
Surprised 1.8%
Angry 1.6%
Confused 0.9%
Happy 0.8%
Disgusted 0.7%

AWS Rekognition

Age 29-39
Gender Male, 98%
Sad 98%
Calm 0.8%
Confused 0.5%
Happy 0.4%
Disgusted 0.1%
Angry 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 99.3%
Calm 76.5%
Sad 19.3%
Happy 2%
Confused 1%
Angry 0.4%
Surprised 0.4%
Disgusted 0.4%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Female, 93.6%
Happy 75.1%
Calm 13.5%
Surprised 3.9%
Sad 3.1%
Confused 2.3%
Angry 1.2%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 51-59
Gender Female, 52%
Sad 98.1%
Calm 0.8%
Confused 0.3%
Happy 0.3%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 48-54
Gender Male, 92.3%
Sad 96.2%
Calm 1.5%
Happy 1.4%
Confused 0.4%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

19265.
17265.
19265
RUDY
the
KIA
I'd

Google

7265.
7265.