Human Generated Data

Title

Untitled (group of adults and children near edge of pool)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8846

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Person 98.9
Person 96.7
Clothing 95.7
Apparel 95.7
Person 94.8
Person 92.7
Shorts 90.8
Person 88.5
People 80.7
Person 75.8
Person 73
Chair 71.4
Furniture 71.4
Pedestrian 64.9
Crowd 61.7
Porch 61.1
Leisure Activities 60.9
Photography 60.6
Photo 60.6
Plant 60.4
Kid 59.7
Child 59.7
Tree 59
Outdoors 58.6
Door 57.6
Stage 56.8
Play 56.5
Girl 55.9
Female 55.9
Coat 55.8

Clarifai
created on 2023-10-26

people 99.8
group together 98.4
child 98.4
group 97.3
many 96.1
adult 95.7
man 94.8
recreation 94.6
woman 94.1
monochrome 90.6
wear 88.6
enjoyment 87.9
music 87.5
boy 85.4
several 84.3
adolescent 81.9
dancing 81.4
dancer 80.6
family 80.4
education 79.2

Imagga
created on 2022-01-15

person 20.2
man 20.2
people 19
male 16.3
adult 15.8
silhouette 15.7
couple 14.8
newspaper 14.3
water 13.3
musical instrument 12.2
sport 11.9
sunset 11.7
portrait 11.6
fun 11.2
outdoors 11.2
product 11.2
child 11.1
world 10.8
teacher 10.6
love 10.3
creation 10.1
park 9.9
black 9.6
men 9.4
day 9.4
happy 9.4
dark 9.2
summer 9
sky 8.9
symbol 8.8
educator 8.7
lifestyle 8.7
happiness 8.6
beach 8.4
outdoor 8.4
leisure 8.3
art 8.2
dancer 8.1
activity 8.1
romance 8
stringed instrument 7.8
performer 7.7
sign 7.5
human 7.5
city 7.5
holding 7.4
business 7.3
protection 7.3
romantic 7.1
women 7.1
businessman 7.1
travel 7
together 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.8
dance 91.9
person 85.9
clothing 85.5
old 52.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Male, 53.5%
Sad 78.7%
Calm 13.4%
Confused 2.4%
Happy 2.3%
Surprised 1.5%
Disgusted 1%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 22-30
Gender Female, 98.4%
Happy 67.2%
Calm 23.7%
Sad 7.8%
Confused 0.3%
Fear 0.3%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%

AWS Rekognition

Age 24-34
Gender Male, 96.8%
Happy 98.3%
Calm 1.3%
Sad 0.2%
Surprised 0.1%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

39483-B

Google

39483-0 T3RA 2--AGO
39483-0
T3RA
2--AGO