Human Generated Data

Title

Untitled (wedding group portrait, Brookline, Massachusetts)

Date

1946, printed later

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.955

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.9
Apparel 99.9
Person 98.8
Human 98.8
Person 98.7
Person 96.8
Person 96.7
Person 96.3
Person 94.6
Robe 93.8
Fashion 93.8
Person 93.1
Gown 91.6
Wedding 91
Wheel 84
Machine 84
Wedding Gown 80.8
Bride 78.8
Person 78.4
Overcoat 72.2
Coat 72.2
Suit 63.7
Car 63.1
Transportation 63.1
Vehicle 63.1
Automobile 63.1

Clarifai
created on 2023-10-15

people 100
wedding 99.7
group 98.9
group together 98.7
groom 98.4
bride 98.1
many 97.8
woman 97
adult 96.8
street 96.6
monochrome 96.4
man 94.5
wear 93.2
several 92.8
vehicle 92.1
ceremony 89
veil 89
dress 88.6
leader 85.7
portrait 84.3

Imagga
created on 2021-12-14

groom 30.1
bride 26
dress 25.3
people 25.1
love 23.7
wedding 22.1
church 20.3
person 19.8
adult 19.5
couple 18.3
man 17.5
married 16.3
religion 16.1
clothing 16
happiness 15.7
ceremony 14.5
women 14.2
flowers 13.9
marriage 13.3
bouquet 13.2
white 13.2
celebration 12.7
gown 12.7
old 12.5
religious 12.2
fashion 12.1
male 12
catholic 11.9
limousine 11.8
veil 11.7
outdoor 11.5
happy 11.3
two 11
car 10.9
life 10.8
statue 10.7
tourism 10.7
holy 10.6
together 10.5
historic 10.1
city 10
history 9.8
bridal 9.7
faith 9.6
day 9.4
architecture 9.4
clothes 9.4
monument 9.3
elegance 9.2
face 9.2
tourist 9.2
travel 9.2
attractive 9.1
outdoors 8.9
building 8.9
men 8.6
outside 8.5
walking 8.5
business 8.5
hand 8.3
suit 8.1
uniform 8
hair 7.9
pedestrian 7.9
nurse 7.8
chapel 7.8
smile 7.8
metropolitan 7.8
wall 7.7
formal 7.6
seller 7.6
wife 7.6
adults 7.6
tradition 7.4
cheerful 7.3
art 7.3
new 7.3
looking 7.2
romance 7.1
portrait 7.1
businessman 7.1

Google
created on 2021-12-14

Land vehicle 96.3
Vehicle 95.5
Car 95.5
Motor vehicle 90.7
Dress 83.7
Wheel 82.8
People 78.7
Classic 78.7
Monochrome 76.5
Monochrome photography 74.1
Vintage clothing 73.1
Event 71.2
Tire 67.6
Formal wear 66.7
Suit 66.7
Classic car 66.1
Stock photography 65.3
Family car 64.7
Plant 64.1
History 63.7

Microsoft
created on 2021-12-14

person 98.3
outdoor 97.9
text 97.7
wedding dress 94.7
dress 93.2
clothing 93.1
bride 92.9
man 91.2
woman 87.4
standing 86.5
posing 85.6
group 75.7
car 70.3
suit 70
smile 61.9
vehicle 59.8
wedding 58.6
old 55.7
people 55.4
black and white 54.9
clothes 22.1
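
The label-and-confidence lists above are raw output from commercial image-tagging services. As a rough sketch of how such tags can be produced, the following Python snippet calls Amazon Rekognition's DetectLabels operation through boto3; the file name, label cap, and confidence threshold are illustrative assumptions, not values recorded here.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("wedding_group_portrait.jpg", "rb") as f:  # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,         # cap the number of returned labels
    MinConfidence=60.0,   # drop low-confidence tags
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')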

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 96.9%
Happy 99.4%
Angry 0.2%
Surprised 0.1%
Fear 0.1%
Calm 0.1%
Sad 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 50-68
Gender Male, 53.3%
Calm 96.8%
Sad 1.7%
Angry 0.6%
Happy 0.4%
Surprised 0.3%
Disgusted 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 37-55
Gender Female, 93.1%
Happy 93.6%
Calm 5.9%
Disgusted 0.1%
Surprised 0.1%
Sad 0.1%
Angry 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 22-34
Gender Female, 97.8%
Happy 63.6%
Calm 28.7%
Surprised 1.8%
Disgusted 1.8%
Confused 1.1%
Fear 1.1%
Angry 1.1%
Sad 0.9%

AWS Rekognition

Age 33-49
Gender Male, 99.7%
Happy 99.6%
Calm 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 51-69
Gender Female, 98%
Happy 97.6%
Calm 2.3%
Sad 0%
Angry 0%
Surprised 0%
Confused 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 50-68
Gender Female, 98%
Calm 98.5%
Sad 1.2%
Happy 0.1%
Angry 0.1%
Surprised 0%
Fear 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 42-60
Gender Male, 99.3%
Happy 73.8%
Calm 23.9%
Sad 0.7%
Confused 0.5%
Angry 0.4%
Surprised 0.4%
Disgusted 0.2%
Fear 0.1%

Microsoft Cognitive Services

Age 45
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
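
The per-face age, gender, and emotion estimates above follow the shape of Amazon Rekognition's DetectFaces response. A minimal sketch of printing one line per detected face, under the same assumptions as the labeling snippet (boto3 and a hypothetical local copy of the image):

import boto3

rekognition = boto3.client("rekognition")

with open("wedding_group_portrait.jpg", "rb") as f:  # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f'Age {age["Low"]}-{age["High"]}  '
          f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%  '
          f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')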

Feature analysis

Amazon

Person 98.8%
Wheel 84%

Text analysis

Amazon

P8
77%
73

Google

70 PE
70
PE
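
The short fragments above resemble raw text detection on lettering visible somewhere in the image. A minimal sketch of such text detection with Amazon Rekognition, under the same assumptions as the snippets above:

import boto3

rekognition = boto3.client("rekognition")

with open("wedding_group_portrait.jpg", "rb") as f:  # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # report whole detected lines rather than single words
        print(detection["DetectedText"], f'{detection["Confidence"]:.0f}%')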