Human Generated Data

Title

Untitled (bride and groom outside the church)

Date

1965, printed later

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1048

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.6
Apparel 99.6
Human 98.5
Person 98.5
Person 98.5
Person 98.2
Person 98.1
Person 97.3
Person 95.8
Person 95.3
Person 86.3
Fashion 82.3
Gown 82.3
Robe 79.4
Wedding 78.8
Person 76.3
Military Uniform 76.2
Military 76.2
Officer 76.2
Overcoat 75.4
Coat 75.4
Person 71.4
Wedding Gown 70.8
Suit 69.6
Funeral 67.4
People 58.6
Crowd 58.4
Bride 58

Imagga
created on 2021-12-14

brass 40.2
man 33.6
wind instrument 33.3
couple 32.2
male 31.2
people 30.7
groom 26.4
musical instrument 26.1
men 24
bride 24
person 23.6
happy 23.2
happiness 21.2
wedding 21.2
adult 20.6
device 20.4
businessman 17.7
business 17.6
megaphone 17.4
love 17.4
smiling 17.4
married 17.3
dress 17.2
marriage 17.1
together 16.6
smile 16.4
trombone 16.3
women 15
group 14.5
acoustic device 14.4
suit 14.4
cornet 14.1
two 13.5
team 13.4
businesswoman 12.7
meeting 12.2
new 12.1
office 12
corporate 12
professional 12
husband 11.9
day 11.8
portrait 11.6
family 11.6
park 11.5
bouquet 11.3
success 11.3
outdoors 11.2
job 10.6
executive 10.5
talking 10.5
home 10.4
military uniform 10.2
mature 10.2
teamwork 10.2
worker 10
hand 9.9
uniform 9.8
clothing 9.7
flowers 9.6
hands 9.6
wife 9.5
senior 9.4
horn 9.4
room 9.3
manager 9.3
successful 9.1
modern 9.1
attractive 9.1
summer 9
lady 8.9
romantic 8.9
newly 8.9
discussion 8.8
colleagues 8.7
table 8.7
boss 8.6
businesspeople 8.5
pair 8.5
human 8.2
celebration 8
lifestyle 7.9
bridegroom 7.9
work 7.8
ceremony 7.8
partner 7.7
outdoor 7.6
formal 7.6
casual 7.6
life 7.5
fun 7.5
cheerful 7.3
looking 7.2
religion 7.2
building 7.1
handsome 7.1

Google
created on 2021-12-14

Outerwear 95.2
Coat 92.5
Hat 89.9
Tree 89.5
Gesture 85.3
Motor vehicle 84.9
Black-and-white 84.8
Style 84
Suit 81.8
Headgear 81.1
Tie 79.6
Vehicle 78.2
Classic 76.5
Monochrome photography 76.2
Sun hat 75.8
Monochrome 75.7
Vintage clothing 75.5
Formal wear 75
Event 74.3
Fedora 71.3

Microsoft
created on 2021-12-14

person 100
tree 99.4
outdoor 97.6
clothing 96.7
man 92.7
people 90.6
standing 89
group 84.6
posing 66.8
musical instrument 60.1
black and white 56.8
old 45
megaphone 18.1
clothes 15.9
crowd 3

Face analysis

AWS Rekognition

Age 36-54
Gender Male, 91.7%
Happy 91.5%
Calm 2.7%
Surprised 1.5%
Disgusted 1.3%
Angry 1%
Sad 1%
Fear 0.6%
Confused 0.5%

AWS Rekognition

Age 51-69
Gender Female, 99.5%
Happy 99.9%
Calm 0%
Surprised 0%
Angry 0%
Sad 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 19-31
Gender Female, 57.1%
Sad 25.5%
Disgusted 21.2%
Angry 15.5%
Happy 15.1%
Calm 13.1%
Surprised 5.4%
Confused 3.3%
Fear 0.8%

AWS Rekognition

Age 26-42
Gender Male, 94.6%
Calm 64.1%
Sad 13.7%
Angry 10.4%
Confused 7.4%
Happy 2%
Disgusted 1.2%
Surprised 0.8%
Fear 0.5%

AWS Rekognition

Age 5-15
Gender Male, 85.5%
Fear 81%
Sad 16.5%
Calm 1.2%
Angry 0.5%
Happy 0.3%
Surprised 0.2%
Confused 0.2%
Disgusted 0%

AWS Rekognition

Age 54-72
Gender Male, 87.3%
Calm 54%
Happy 25.1%
Sad 14.5%
Confused 2%
Angry 1.3%
Surprised 1.3%
Fear 1%
Disgusted 0.8%

AWS Rekognition

Age 26-40
Gender Male, 98.5%
Happy 70%
Calm 13.1%
Confused 5.4%
Disgusted 4.7%
Sad 4.1%
Surprised 1.8%
Angry 0.4%
Fear 0.4%

AWS Rekognition

Age 38-56
Gender Male, 96%
Happy 43.8%
Calm 28.7%
Disgusted 9.8%
Angry 6.8%
Surprised 4.1%
Confused 2.8%
Fear 2.4%
Sad 1.6%

AWS Rekognition

Age 35-51
Gender Male, 59.9%
Calm 36.8%
Happy 21.1%
Angry 13.4%
Sad 6.7%
Disgusted 6.3%
Confused 6.2%
Fear 4.9%
Surprised 4.6%

AWS Rekognition

Age 39-57
Gender Female, 92.2%
Happy 99.5%
Surprised 0.1%
Calm 0.1%
Fear 0.1%
Angry 0.1%
Sad 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 46-64
Gender Male, 95.9%
Calm 99.3%
Sad 0.4%
Surprised 0.1%
Confused 0.1%
Angry 0%
Fear 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 42-60
Gender Male, 88.3%
Happy 66.7%
Calm 31.1%
Surprised 0.6%
Confused 0.5%
Sad 0.4%
Angry 0.3%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 38-56
Gender Male, 88.5%
Happy 95.7%
Surprised 1.5%
Sad 0.8%
Fear 0.7%
Calm 0.6%
Angry 0.3%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 35-51
Gender Female, 64.3%
Happy 64.3%
Calm 26.3%
Sad 5.7%
Angry 1.5%
Disgusted 0.7%
Surprised 0.7%
Confused 0.5%
Fear 0.3%

AWS Rekognition

Age 50-68
Gender Male, 79%
Calm 64.4%
Confused 11.8%
Angry 8.1%
Sad 6.9%
Happy 4.5%
Disgusted 2%
Surprised 1.8%
Fear 0.5%

AWS Rekognition

Age 46-64
Gender Male, 63.6%
Fear 27.8%
Calm 15.7%
Surprised 15.5%
Happy 14%
Angry 12.8%
Disgusted 7.1%
Sad 5.5%
Confused 1.5%

AWS Rekognition

Age 38-56
Gender Male, 96.8%
Calm 82.3%
Sad 12.9%
Confused 1.3%
Happy 1.2%
Angry 0.9%
Disgusted 0.5%
Fear 0.4%
Surprised 0.4%

Microsoft Cognitive Services

Age 55
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Suit 69.6%

Captions

Microsoft

a group of people posing for a photo 98.2%
a group of people posing for the camera 98.1%
a group of people posing for a picture 98%