Human Generated Data

Title

Untitled (wedding guests standing near cake table)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8749

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 96.1
Human 96.1
Person 96.1
Person 95
Person 94.7
Person 91.9
Meal 91.3
Food 91.3
Person 88.5
Clothing 87.3
Apparel 87.3
Shelter 85.9
Rural 85.9
Building 85.9
Nature 85.9
Outdoors 85.9
Countryside 85.9
Person 82.6
Person 81.6
Person 78
People 75.5
Person 74.7
Urban 73.5
Person 72.9
Architecture 72.6
Person 69.9
Person 69.1
Crowd 67.9
Dress 66.7
Person 64.6
Vacation 63.3
Plant 63.2
Painting 62.6
Art 62.6
City 61.8
Town 61.8
Photography 61.1
Photo 61.1
Leisure Activities 60.5
Person 60
Person 59.2
Picnic 58.6
Tree 58.3
Pedestrian 57.1
Dome 56.6
Silhouette 55.1

Clarifai
created on 2023-10-25

people 100
group 99.6
many 99.6
group together 98.5
child 97.8
crowd 97.4
man 96.9
woman 96.1
adult 95.9
war 93.8
recreation 88.4
boy 87.7
military 86.9
administration 86.8
several 84.8
leader 84.2
wear 84.2
merchant 83.5
furniture 81.9
art 81.4

Imagga
created on 2022-01-09

man 29
people 27.3
beach 24.7
male 20.7
water 20
sea 19.5
silhouette 19
summer 18
ocean 17.4
sky 17.2
sand 16.6
seller 16.5
men 15.4
sunset 15.3
person 15.3
leisure 14.9
outdoors 14.2
travel 14.1
couple 13.9
container 13.4
vacation 13.1
business 12.7
women 12.6
lifestyle 12.3
group 12.1
basket 11.9
shopping basket 11.8
active 11.2
fun 11.2
shore 11.1
sport 11.1
happiness 11
relax 10.9
holiday 10.7
sun 10.5
tourism 9.9
coast 9.9
kin 9.8
businessman 9.7
boy 9.6
dusk 9.5
pedestrian 9.5
walking 9.5
friends 9.4
tropical 9.4
evening 9.3
shopping cart 9.3
waves 9.3
adult 9.1
activity 9
happy 8.8
love 8.7
coastline 8.5
landscape 8.2
horizon 8.1
child 8.1
success 8
family 8
together 7.9
handcart 7.9
day 7.8
black 7.8
scene 7.8
life 7.8
youth 7.7
winter 7.7
outdoor 7.6
wheeled vehicle 7.6
walk 7.6
hand 7.6
friendship 7.5
city 7.5
tourist 7.3
lake 7.3
world 7.1
romantic 7.1
job 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.5
outdoor 97.4
clothing 97.3
person 94.4
black and white 90.5
woman 83.7
man 82.7
people 78.4
footwear 70.6
group 63.5
monochrome 59.4
drawing 57.8
crowd 20

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 40-48
Gender Male, 97.5%
Calm 98.7%
Sad 0.7%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 90.3%
Sad 58%
Happy 23.6%
Calm 9.2%
Confused 6.3%
Disgusted 1.1%
Angry 0.8%
Surprised 0.7%
Fear 0.3%

AWS Rekognition

Age 10-18
Gender Male, 99.8%
Sad 59.2%
Calm 29.4%
Disgusted 6.1%
Happy 2.1%
Confused 1.4%
Fear 1.1%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 20-28
Gender Male, 75.2%
Calm 63.8%
Confused 11%
Sad 7.9%
Angry 5.7%
Surprised 3.9%
Disgusted 3.5%
Happy 3.3%
Fear 0.8%

AWS Rekognition

Age 25-35
Gender Male, 89.6%
Calm 89.7%
Happy 3.4%
Surprised 2.3%
Sad 1.9%
Fear 1.4%
Confused 0.6%
Disgusted 0.4%
Angry 0.3%

Feature analysis

Amazon

Person 96.1%
Painting 62.6%

Text analysis

Amazon

to
C6.SBE