Human Generated Data

Title

Untitled (throwing rice on newlyweds)

Date

1955

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19824

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.3
Human 99.3
Person 99.3
Person 99
Clothing 98.5
Apparel 98.5
Person 97.2
Person 97.2
Person 95.9
Person 95.8
Person 95.4
Person 94
Dress 88.9
People 84.5
Female 83.1
Indoors 82.7
Person 81.1
Room 80.7
Suit 76.5
Overcoat 76.5
Coat 76.5
Person 75.6
Person 69.6
Person 65.7
Woman 65.6
Crowd 62.9
Person 61.4
Person 59.6
Girl 58.5
Party 57.5
Shoe 57.1
Footwear 57.1
Person 53.2
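
A minimal sketch of how a tag list like this can be produced, assuming the boto3 SDK for AWS Rekognition, configured AWS credentials, and a hypothetical local copy of the print at photo.jpg:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # illustrative threshold; the list above bottoms out near 53
    )

# Each label pairs a name with a confidence score, matching the
# "Person 99.3" style of the entries above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))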

Clarifai
created on 2023-10-22

people 99.9
group 99.3
many 98.7
woman 97.5
group together 96.3
adult 95.3
man 94.3
recreation 92.7
crowd 92.4
administration 90.2
child 89.6
monochrome 89.6
war 87.9
education 85.7
several 85.6
room 84
leader 82.2
police 81.3
school 80.3
military 80

Imagga
created on 2022-03-05

people 31.8
business 24.3
man 22.2
adult 22.2
boutique 21.6
groom 19.7
men 18
male 17.7
businessman 16.8
bride 16.6
couple 16.5
person 16.2
shop 15.4
women 15
wedding 14.7
dress 14.5
urban 14
building 13.9
happy 13.8
group 13.7
kin 13.6
love 13.4
happiness 13.3
city 13.3
clothing 13.1
life 12.8
corporate 12
two 11.9
church 11.1
room 11
professional 10.9
suit 10.8
interior 10.6
barbershop 10.6
travel 10.6
fashion 10.6
indoors 10.5
office 10.5
human 10.5
team 9.9
teacher 9.8
bouquet 9.6
marriage 9.5
work 9.4
architecture 9.4
tourist 9.2
outfit 9
new 8.9
catholic 8.8
celebration 8.8
ceremony 8.7
station 8.7
scene 8.7
married 8.6
husband 8.6
walking 8.5
youth 8.5
clothes 8.4
portrait 8.4
modern 8.4
old 8.4
activity 8.1
success 8
hall 8
family 8
black 7.8
gown 7.8
worker 7.6
businesspeople 7.6
career 7.6
tourism 7.4
teamwork 7.4
street 7.4
time 7.3
lifestyle 7.2
home 7.2
religion 7.2
transportation 7.2
mercantile establishment 7.2
job 7.1
day 7.1
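
The Clarifai, Imagga, and Microsoft lists follow the same name-plus-confidence pattern via each vendor's REST API. A sketch against Imagga's public v2 tagging endpoint, with placeholder credentials and a hypothetical image URL (both assumptions):

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # hypothetical URL
    auth=(API_KEY, API_SECRET),    # Imagga uses HTTP basic auth
)

for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))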

Microsoft
created on 2022-03-05

dress 99.2
person 99.1
wedding dress 98.7
clothing 96.2
bride 95.5
woman 93.5
text 88.6
outdoor 88.4
wedding 70.6
sport 70.3
black and white 60.9
man 59.4

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 80.1%
Calm 44.4%
Surprised 30.5%
Sad 14.8%
Angry 3.4%
Disgusted 3%
Confused 1.7%
Happy 1.5%
Fear 0.7%

AWS Rekognition

Age 34-42
Gender Female, 62.2%
Sad 42%
Calm 37.6%
Happy 9.7%
Confused 8.3%
Disgusted 0.9%
Angry 0.7%
Fear 0.5%
Surprised 0.3%

AWS Rekognition

Age 30-40
Gender Male, 86.9%
Calm 38.4%
Happy 26.8%
Sad 14.2%
Disgusted 9.8%
Angry 3.8%
Surprised 3.2%
Confused 2.1%
Fear 1.6%

AWS Rekognition

Age 31-41
Gender Female, 78.6%
Sad 96.2%
Confused 1.7%
Fear 0.9%
Happy 0.5%
Disgusted 0.3%
Calm 0.2%
Surprised 0.2%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Female, 97.3%
Sad 58.2%
Calm 20.7%
Confused 7.8%
Happy 5.1%
Angry 3.3%
Disgusted 2%
Fear 2%
Surprised 0.8%

AWS Rekognition

Age 23-31
Gender Female, 62.2%
Sad 44.2%
Calm 28.2%
Confused 13.8%
Happy 5.6%
Fear 4.4%
Disgusted 1.4%
Angry 1.4%
Surprised 0.9%

AWS Rekognition

Age 28-38
Gender Female, 60.4%
Calm 61.2%
Confused 19.4%
Happy 13.3%
Sad 2.9%
Fear 1%
Disgusted 0.9%
Surprised 0.8%
Angry 0.4%
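
Each block above maps onto one element of the FaceDetails array that Rekognition returns when all face attributes are requested. A minimal sketch, again assuming boto3, configured credentials, and a hypothetical photo.jpg:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Sorting by confidence reproduces the highest-first layout used above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")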

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
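
Unlike Rekognition's percentages, the Google Vision blocks report coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY), one enum per attribute. A sketch assuming the google-cloud-vision client library, configured GCP credentials, and the same hypothetical photo.jpg:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    response = client.face_detection(image=vision.Image(content=f.read()))

for face in response.face_annotations:
    # Each *_likelihood field is a Likelihood enum, hence the bucketed values above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)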

Feature analysis

Amazon

Person
Shoe
Person 99.3%
Person 99.3%
Person 99%
Person 97.2%
Person 97.2%
Person 95.9%
Person 95.8%
Person 95.4%
Person 94%
Person 81.1%
Person 75.6%
Person 69.6%
Person 65.7%
Person 61.4%
Person 59.6%
Person 53.2%
Shoe 57.1%
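
These per-instance percentages come from the same detect_labels call as the tag list: labels such as Person and Shoe carry an Instances array, one entry per detected object with its own confidence and bounding box. A sketch, under the same boto3 assumptions as above:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # relative coordinates in [0, 1]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(left={box['Left']:.2f}, top={box['Top']:.2f})")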

Text analysis

Amazon

304
5
09
KODAKA-ITW
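
Strings like KODAKA-ITW appear to be OCR readings of Kodak film-edge markings and frame numbers rather than text in the scene itself. A sketch of the call that yields such detections, under the same boto3 assumptions:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections are whole strings; WORD detections are their tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))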

Google

MJI7--Y T 3RA°2-- XAGO
MJI7--Y
T
3RA°2--
XAGO
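
The Google results follow the Vision API text-detection convention: the first annotation holds the full detected string, and the entries after it are its individual tokens, which is exactly the pattern above. A sketch under the same google-cloud-vision assumptions:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    response = client.text_detection(image=vision.Image(content=f.read()))

annotations = response.text_annotations
if annotations:
    print(annotations[0].description)  # full string (first line above)
    for token in annotations[1:]:      # individual tokens (remaining lines)
        print(token.description)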