Human Generated Data

Title

Untitled (woman standing on a table in outdoor seating area, Colony Beach Resort)

Date

1960

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11471

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (each line gives a label followed by the service's confidence score, 0-100)

Amazon
created on 2022-01-14

Person 99.4
Human 99.4
Person 99.4
Restaurant 92.3
Person 90.9
Person 86.1
Person 84.4
Cafeteria 83.8
Room 83.5
Indoors 83.5
Tabletop 81
Furniture 81
Person 79.8
Person 74.7
Person 74.2
Person 72.8
Table 70.1
People 68.3
Crowd 66.9
Person 64.8
Dining Table 61.1
Person 59.1
Workshop 57.1
Classroom 55.9
School 55.9
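
The label-and-confidence pairs above are characteristic of Amazon Rekognition's DetectLabels output. Below is a minimal sketch of how tags like these could be generated with boto3; the S3 bucket and object names are hypothetical placeholders, not the museum's actual pipeline.

# Minimal sketch: label tags via Amazon Rekognition DetectLabels (boto3).
# Bucket and key names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-11471.jpg"}},
    MaxLabels=25,
    MinConfidence=55,
)

# Print each label with its confidence, mirroring the "Person 99.4" style above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')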

Clarifai
created on 2023-10-25

people 100
many 99.8
group 99.3
group together 99.1
adult 98.2
man 98.1
crowd 97.9
sit 95.1
child 94
chair 93.9
audience 93.9
woman 93.7
furniture 93.6
administration 92.3
boy 90.6
spectator 90.1
recreation 89.6
sitting 89.5
war 87.4
seat 87
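
Concept tags in this style can be requested from Clarifai's v2 predict endpoint. The sketch below is illustrative only; the API key, model ID ("general-image-recognition"), and image URL are assumptions rather than the museum's actual configuration.

# Minimal sketch: concept tags from Clarifai's v2 predict endpoint.
# API key, model ID, and image URL are placeholder assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed general concept model
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz-11471.jpg"}}}]}
resp = requests.post(url, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

# Concepts carry a 0-1 confidence value; scale to 0-100 to match "people 100" above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')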

Imagga
created on 2022-01-14

blackboard 28.3
newspaper 21.5
product 19
man 18.1
person 16.4
people 15.6
male 14.9
architecture 14.8
sky 14.7
city 14.1
classroom 14.1
business 14
creation 13.7
water 13.3
building 13.2
room 13.1
daily 12.5
travel 12
house 11.7
black 10.8
scene 10.4
drawing 10.3
construction 10.3
design 10.1
adult 10
tourism 9.9
businessman 9.7
outdoors 9.7
urban 9.6
men 9.4
day 9.4
silhouette 9.1
transportation 9
group 8.9
job 8.8
work 8.8
light 8.7
structure 8.6
grunge 8.5
stage 8.4
old 8.4
human 8.2
symbol 8.1
cloud 7.7
modern 7.7
outside 7.7
chart 7.6
outdoor 7.6
evening 7.5
history 7.2
glass 7.1
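
Tags such as these are returned by Imagga's v2 tagging endpoint. The following sketch assumes placeholder credentials and an example image URL.

# Minimal sketch: tags from Imagga's v2 tagging endpoint with HTTP Basic auth.
# Key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz-11471.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Imagga reports confidence on a 0-100 scale, matching the "blackboard 28.3" style above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')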

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 99.6
drawing 84.1
person 71.5
posing 66.1
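
Tags in this style can be obtained from the Azure Computer Vision analyze API. In the sketch below, the endpoint host, subscription key, and image URL are placeholder assumptions.

# Minimal sketch: image tags from the Azure Computer Vision "analyze" API (v3.2).
# Endpoint host, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://example-resource.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_CV_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz-11471.jpg"},
)
resp.raise_for_status()

# Confidence is reported on a 0-1 scale; multiply by 100 to match "text 99.6" above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')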

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Female, 70.1%
Happy 99.6%
Calm 0.2%
Sad 0.1%
Disgusted 0%
Angry 0%
Confused 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 52-60
Gender Male, 98.2%
Calm 96.8%
Sad 2.4%
Happy 0.2%
Disgusted 0.2%
Angry 0.2%
Confused 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 61.1%
Calm 89.8%
Sad 3.9%
Disgusted 3.1%
Fear 1.7%
Angry 0.5%
Happy 0.5%
Surprised 0.3%
Confused 0.3%

AWS Rekognition

Age 45-51
Gender Female, 94%
Happy 45.3%
Sad 31%
Fear 8%
Angry 5.2%
Calm 4.9%
Disgusted 3.1%
Confused 1.7%
Surprised 0.7%

AWS Rekognition

Age 39-47
Gender Male, 63.4%
Calm 68.8%
Sad 26.8%
Happy 1.2%
Confused 1.2%
Fear 0.9%
Disgusted 0.4%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 24-34
Gender Female, 74.6%
Happy 28%
Calm 24.4%
Surprised 13.3%
Fear 10.5%
Confused 10.4%
Sad 7%
Disgusted 4.1%
Angry 2.2%

AWS Rekognition

Age 24-34
Gender Female, 93.2%
Happy 37.6%
Calm 35.2%
Sad 20.3%
Fear 2.4%
Disgusted 1.6%
Angry 1.2%
Surprised 1%
Confused 0.7%

AWS Rekognition

Age 37-45
Gender Female, 86.5%
Calm 99.4%
Angry 0.2%
Sad 0.1%
Happy 0.1%
Disgusted 0%
Surprised 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 98.9%
Happy 89.1%
Disgusted 3.7%
Calm 2%
Sad 1.3%
Fear 1.2%
Surprised 1.1%
Confused 0.9%
Angry 0.8%
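
The per-face age range, gender, and emotion estimates above follow the shape of Amazon Rekognition's DetectFaces response. A minimal sketch, assuming a hypothetical S3 location for the image:

# Minimal sketch: per-face attributes via Amazon Rekognition DetectFaces (boto3).
# Bucket and key names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-11471.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences sum to roughly 100%; print them highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')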

Feature analysis

Amazon

Person 99.4%

Categories

Text analysis

Amazon

EPTEN
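
Detected strings like the one above are typical of Amazon Rekognition's DetectText output. A minimal sketch, again with placeholder S3 names:

# Minimal sketch: text detection via Amazon Rekognition DetectText (boto3).
# Bucket and key names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-11471.jpg"}}
)

# Keep only LINE-level detections; WORD-level entries repeat the same strings.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')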