Human Generated Data

Title

Untitled (wedding guests standing outside of church)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10733

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 99.5
Apparel 99.2
Clothing 99.2
Person 99.2
Person 98.9
Person 98.3
Person 97.9
Person 95.8
Person 95.4
Person 94.3
Pedestrian 93.5
Person 91.6
Person 91.4
People 91.1
Face 88
Suit 87.2
Overcoat 87.2
Coat 87.2
Person 86.6
Dress 86.6
Person 86.1
Robe 84
Fashion 84
Gown 82.3
Transportation 81.4
Car 81.4
Vehicle 81.4
Automobile 81.4
Crowd 79.7
Wedding 78
Home Decor 70.6
Bridegroom 69.7
Wheel 68.7
Machine 68.7
Wedding Gown 68
Porch 67.8
Urban 66.9
Person 65.8
Female 65.1
Footwear 63.7
Shoe 63.7
Photography 60.6
Photo 60.6
Tuxedo 57.3
Bride 55.6
Person 55.5
Woman 55.4
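
The labels above pair a term with a 0-100 confidence score, in the style of Amazon Rekognition's DetectLabels response. Below is a minimal sketch of how comparable tags can be requested with boto3; the S3 bucket and object key are illustrative placeholders, not part of this record.

# Minimal sketch: image labels via Amazon Rekognition DetectLabels (boto3).
# The S3 bucket/key below are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-wedding.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,
)

# Each label carries a name and a 0-100 confidence score, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')

The entries under Feature analysis further down (Person, Car, Wheel, Shoe) appear to correspond to labels for which Rekognition also returns bounding-box Instances.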

Clarifai
created on 2023-10-26

people 99.9
group 99.4
many 99.2
group together 99
man 97.5
administration 95.7
leader 95.7
adult 95.6
woman 95.5
child 92.5
family 88.8
several 88.5
home 88.3
crowd 86.8
outfit 85.2
vehicle 81.4
spectator 80.2
chair 79.9
street 79.8
war 77.9
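
Clarifai scores its concepts on a 0-1 scale, shown here as percentages. The following is a minimal sketch assuming the Clarifai v2 REST predict endpoint and its general recognition model; the API key, model id, and image URL are placeholders and may differ from whatever workflow actually produced the tags above.

# Minimal sketch: concept tags from the Clarifai v2 predict endpoint (REST).
# API key, model id, and image URL are placeholders; account setup and SDK versions may differ.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                        # placeholder
MODEL_ID = "general-image-recognition"                   # assumed id of Clarifai's general model
IMAGE_URL = "https://example.org/steinmetz-wedding.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 confidence ("value").
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')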

Imagga
created on 2022-01-15

people 19
city 15
person 14.9
man 14.8
musical instrument 12.9
business 12.8
transportation 12.5
dark 10.9
adult 10.8
male 10.6
urban 10.5
window 10.4
men 10.3
world 10.3
percussion instrument 10.1
dirty 9.9
silhouette 9.9
sport 9.9
travel 9.9
black 9.6
life 9.6
scene 9.5
transport 9.1
industrial 9.1
style 8.9
crowd 8.6
walk 8.6
portrait 8.4
clothing 8.4
outdoor 8.4
power 8.4
building 8.4
danger 8.2
chair 7.9
women 7.9
train 7.7
room 7.4
street 7.4
music 7.2
vehicle 7.1
businessman 7.1
architecture 7
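
Imagga's tagger returns similar word/confidence pairs. Here is a minimal sketch against the Imagga /v2/tags endpoint, assuming basic-auth API credentials; the key, secret, and image URL are placeholders.

# Minimal sketch: tags from the Imagga /v2/tags endpoint.
# API key/secret and image URL are placeholders; consult Imagga's docs for current parameters.
import requests

IMAGGA_KEY = "YOUR_API_KEY"                              # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"                        # placeholder
IMAGE_URL = "https://example.org/steinmetz-wedding.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each tag has an English label and a 0-100 confidence, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')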

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

building 99
outdoor 98.6
text 97.5
person 96.7
clothing 93.1
woman 80.4
house 77.8
man 69.8
dress 69.8
group 66.8
people 65.5
store 32.3
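
The Microsoft tags match the Azure Computer Vision "tag" feature, which scores each tag from 0 to 1. Below is a minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and Microsoft's newer Image Analysis APIs may differ.

# Minimal sketch: image tags via Azure Computer Vision.
# Endpoint, key, and image URL are illustrative placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://your-resource.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_KEY"                                                 # placeholder
IMAGE_URL = "https://example.org/steinmetz-wedding.jpg"          # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image(IMAGE_URL)

# Tags are scored 0-1; scale to percentages to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")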

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 36-44
Gender Male, 97.3%
Calm 41%
Confused 36%
Sad 11.3%
Surprised 4%
Angry 2.5%
Happy 2.4%
Fear 2%
Disgusted 0.8%

AWS Rekognition

Age 34-42
Gender Male, 84.2%
Happy 86.8%
Sad 9%
Calm 1.8%
Surprised 1.5%
Confused 0.5%
Angry 0.2%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 16-24
Gender Male, 98.5%
Happy 47.1%
Calm 24.4%
Sad 11.9%
Fear 6.4%
Angry 4.9%
Surprised 3.5%
Disgusted 1%
Confused 0.9%

AWS Rekognition

Age 16-24
Gender Female, 75.2%
Fear 38.7%
Calm 22.8%
Surprised 13.4%
Disgusted 9.7%
Sad 6%
Angry 4%
Confused 3.4%
Happy 2%

AWS Rekognition

Age 42-50
Gender Male, 96.7%
Calm 82.9%
Sad 11%
Surprised 2.8%
Happy 1.9%
Disgusted 0.5%
Angry 0.4%
Fear 0.3%
Confused 0.2%

AWS Rekognition

Age 16-22
Gender Male, 68.1%
Calm 61.1%
Sad 27.7%
Confused 7.3%
Happy 1%
Surprised 1%
Angry 0.8%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 34-42
Gender Male, 95.3%
Calm 84.8%
Disgusted 7.2%
Sad 4.1%
Happy 2.5%
Surprised 0.5%
Confused 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 18-24
Gender Male, 98.1%
Calm 99.8%
Happy 0.1%
Sad 0.1%
Confused 0%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 92.3%
Calm 95.3%
Happy 3.2%
Sad 0.4%
Fear 0.3%
Confused 0.2%
Angry 0.2%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 38-46
Gender Male, 99.5%
Calm 64.4%
Happy 19.9%
Sad 8.8%
Confused 2.8%
Disgusted 1.7%
Surprised 0.9%
Angry 0.8%
Fear 0.7%

AWS Rekognition

Age 19-27
Gender Female, 57.8%
Sad 62%
Happy 19.5%
Calm 10%
Confused 2.3%
Disgusted 1.8%
Angry 1.7%
Fear 1.6%
Surprised 1.2%

AWS Rekognition

Age 33-41
Gender Female, 72.3%
Calm 47.3%
Happy 39%
Sad 3.5%
Confused 2.8%
Fear 2.8%
Angry 1.8%
Surprised 1.6%
Disgusted 1.2%
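
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and a full emotion distribution for one detected face. A minimal sketch with boto3's DetectFaces, requesting all facial attributes, follows; the S3 location is a placeholder.

# Minimal sketch: per-face age range, gender, and emotion scores via Amazon Rekognition DetectFaces.
# The S3 bucket/key below are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-wedding.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are reported as a confidence per category, highest first here.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')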

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
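
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client library follows; the image URI is a placeholder and credentials are assumed to come from the environment (application default credentials).

# Minimal sketch: face likelihoods via the Google Cloud Vision API (google-cloud-vision package).
# The image URI is an illustrative placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/steinmetz-wedding.jpg")
)

response = client.face_detection(image=image)

# Each detected face carries likelihood enums (VERY_UNLIKELY ... VERY_LIKELY), as listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)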

Feature analysis

Amazon

Person 99.5%
Car 81.4%
Wheel 68.7%
Shoe 63.7%

Categories

Text analysis

Amazon

35859
asa
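
The detected strings above are the kind of result Amazon Rekognition's DetectText returns for writing visible in the photograph. A minimal sketch with boto3 follows; the S3 location is a placeholder.

# Minimal sketch: reading visible text via Amazon Rekognition DetectText.
# The S3 bucket/key below are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-wedding.jpg"}}
)

# LINE-level detections give the strings read from the image, with a 0-100 confidence each.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')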