Human Generated Data

Title

Untitled (man speaking from podium to large crowd)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7084

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 95.7
Person 95.5
Person 95
Crowd 91.2
People 89.9
Person 87.1
Person 85.1
Building 76.9
Sailor Suit 65.1
Military 62.4
Audience 62.3
Person 58.3
Person 44
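
The label/confidence pairs above match the shape of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be reproduced with boto3 follows; the local file name and the MinConfidence threshold are illustrative assumptions, not values taken from this record.

    import boto3

    # Hypothetical local copy of the photograph; not the museum's actual file.
    with open("steinmetz_podium.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")

    # DetectLabels returns label names with confidence scores like those listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=40,  # assumed threshold; the record lists labels down to ~44
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')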

Clarifai
created on 2023-10-15

people 99.9
crowd 99.7
many 99.4
audience 98.8
man 98.1
group 96.8
chair 96.6
group together 94.1
adult 93.2
military 90.5
leader 90
war 89.2
music 87.1
meeting 86.7
administration 84.8
cemetery 84.3
seat 83.6
soldier 83.1
league 82.7
funeral 82.7

Imagga
created on 2021-12-15

snow 67.6
city 40.8
weather 39.4
architecture 25
buildings 23.6
aerial 23.3
urban 21.9
cityscape 21.8
building 21.1
town 20.4
travel 19
gymnasium 19
landscape 18.6
spectator 16.9
athletic facility 15.2
tourism 14.9
houses 14.5
panorama 14.3
sky 14
skyline 13.3
facility 13.3
mountain 13
outdoors 12.7
old 11.8
river 11.7
tower 11.6
above 11.6
downtown 11.5
water 11.3
outside 11.1
mountains 11.1
cities 10.8
scenic 10.5
cold 10.3
lake 10.3
church 10.2
structure 10.1
sea 9.4
center 9.2
house 9.2
history 8.9
high 8.7
scene 8.7
sketch 8.7
panoramic 8.6
roof 8.6
construction 8.6
winter 8.5
street 8.3
trees 8
color 7.8
scenics 7.7
outdoor 7.6
horizontal 7.5
stall 7.5
exterior 7.4
vacation 7.4
light 7.4
tourist 7.4
drawing 7.3
coast 7.2
summer 7.1
day 7.1
modern 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 96.5
outdoor 92.8
white 74.2
ship 71.5
black and white 67.6
old 53.9
posing 53
crowd 1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 47-65
Gender Female, 50.8%
Calm 87%
Happy 6.3%
Confused 2.4%
Sad 1.4%
Surprised 1.1%
Angry 0.7%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 18-30
Gender Male, 69.8%
Sad 85.2%
Calm 12%
Confused 0.9%
Happy 0.8%
Angry 0.7%
Fear 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 53-71
Gender Male, 78.2%
Fear 44.2%
Calm 17%
Confused 10.2%
Happy 9.1%
Surprised 8.2%
Sad 6.3%
Angry 2.9%
Disgusted 1.9%

AWS Rekognition

Age 50-68
Gender Male, 86.4%
Calm 83.8%
Surprised 7.2%
Happy 2%
Sad 1.9%
Confused 1.8%
Angry 1.4%
Disgusted 1%
Fear 0.9%

AWS Rekognition

Age 50-68
Gender Female, 56.8%
Happy 79.7%
Calm 17.9%
Sad 0.9%
Angry 0.5%
Surprised 0.3%
Confused 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 54-72
Gender Male, 73.9%
Calm 97.2%
Sad 1.5%
Surprised 0.4%
Confused 0.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 47-65
Gender Female, 66.2%
Calm 70.4%
Happy 18.4%
Sad 5.4%
Confused 1.6%
Angry 1.6%
Disgusted 1.3%
Surprised 1%
Fear 0.3%

Feature analysis

Amazon

Person 95.5%

Categories

Text analysis

Amazon

20840.

Google

20840 - 20840 •
20840
-
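
The "20840." reading under Amazon is consistent with Rekognition's DetectText operation, which returns detected lines and words with confidences. A minimal sketch under the same assumptions as the earlier examples (hypothetical file name, standard boto3 client):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("steinmetz_podium.jpg", "rb") as f:  # hypothetical local file name
        image_bytes = f.read()

    # DetectText returns LINE and WORD detections; printing only lines avoids duplicates.
    text = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in text["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')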