Human Generated Data

Title

Tupelo Women's Auxilary

Date

1950s

People

Artist: Terry Wood, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.675

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.4
Person 99.4
Person 99.3
Person 98.9
Person 98.6
Person 98.5
Person 98
Person 97.7
Person 97.1
Person 96.9
Person 95.5
Person 95.3
Person 95.2
Person 93.7
Person 93.1
Ceiling Fan 88.3
Appliance 88.3
Indoors 87
Room 87
Clinic 86.4
Person 85.7
Person 82.3
People 78.7
Person 73.9
Furniture 73.6
Flooring 72.7
Person 72.3
Hospital 55.9
Operating Theatre 55.9
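The Amazon tags above are label/confidence pairs of the kind returned by Rekognition's DetectLabels API. A minimal sketch of working with such pairs, filtering by a confidence threshold (the values are transcribed from the listing above; the `confident_labels` helper is illustrative, not part of any API):

```python
# A subset of the label/confidence pairs listed above.
tags = [
    ("Human", 99.4), ("Person", 99.4), ("Ceiling Fan", 88.3),
    ("Indoors", 87.0), ("Clinic", 86.4), ("Hospital", 55.9),
]

def confident_labels(pairs, threshold=80.0):
    """Keep only labels scoring at or above the confidence threshold."""
    return [label for label, score in pairs if score >= threshold]

# Labels with at least 80% confidence.
print(confident_labels(tags))
```

Lower-confidence labels such as "Hospital" (55.9) drop out at this threshold, which is why machine tags near the bottom of such lists are usually treated with more caution.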

Imagga
created on 2022-01-08

kin 35.9
people 30.1
teacher 26.9
adult 24.8
person 24.1
shop 23.9
man 23.5
interior 23
educator 20.5
room 20.4
women 19
dress 19
professional 18.2
happy 16.9
home 16.7
indoors 16.7
male 16.4
portrait 16.2
couple 14.8
indoor 14.6
bride 14.4
happiness 14.1
men 13.7
love 13.4
mercantile establishment 13.4
family 13.3
shoe shop 13.1
wedding 12.9
celebration 12.7
two 12.7
lifestyle 12.3
fashion 12.1
groom 11.6
smile 11.4
new 11.3
chair 11.3
smiling 10.8
old 10.4
standing 10.4
bouquet 10.4
business 10.3
clothing 10
girls 10
house 10
married 9.6
day 9.4
window 9.3
elegance 9.2
traditional 9.1
lady 8.9
place of business 8.8
holiday 8.6
luxury 8.6
casual 8.5
city 8.3
inside 8.3
style 8.2
group 8.1
romantic 8
life 8
decoration 8
together 7.9
table 7.8
mother 7.8
husband 7.7
sitting 7.7
pretty 7.7
attractive 7.7
loving 7.6
marriage 7.6
senior 7.5
clothes 7.5
holding 7.4
tradition 7.4
cheerful 7.3
furniture 7.2
color 7.2
activity 7.2
wooden 7

Google
created on 2022-01-08

Picture frame 94.9
Dress 87.5
Style 83.9
Suit 81.9
Multimedia projector 80.2
Chair 80.1
Art 76.3
Table 75
Event 74.2
Vintage clothing 74
Window 73.9
Classic 72.9
Monochrome 72.2
Monochrome photography 70.5
Room 70.1
Class 68.2
Couch 67.2
History 65.5
Stock photography 62.9
Visual arts 60.5

Microsoft
created on 2022-01-08

floor 99.6
indoor 97.6
person 93
clothing 87.5
text 79.6
white 66.9
woman 66.4
table 62.9
family 55.6
old 41.2
vintage 25.9
furniture 17.4
several 15.5

Face analysis

AWS Rekognition

Age 50-58
Gender Female, 100%
Calm 73.9%
Happy 11.5%
Sad 9.5%
Fear 1.9%
Surprised 1.2%
Angry 1.1%
Disgusted 0.5%
Confused 0.4%

AWS Rekognition

Age 24-34
Gender Female, 100%
Sad 32.8%
Surprised 23.6%
Angry 10.6%
Calm 9.5%
Happy 8.6%
Fear 7.6%
Disgusted 6.2%
Confused 1%

AWS Rekognition

Age 29-39
Gender Female, 100%
Calm 83.2%
Surprised 4.6%
Sad 4.3%
Angry 3.5%
Confused 2.5%
Disgusted 1.1%
Fear 0.5%
Happy 0.3%

AWS Rekognition

Age 30-40
Gender Female, 99.7%
Happy 28.7%
Calm 26.7%
Sad 20.2%
Confused 7.7%
Angry 6.1%
Surprised 4.5%
Fear 4.1%
Disgusted 2%

AWS Rekognition

Age 40-48
Gender Female, 100%
Calm 99.9%
Angry 0%
Sad 0%
Disgusted 0%
Confused 0%
Surprised 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 48-54
Gender Female, 99.6%
Calm 95.6%
Sad 1.3%
Fear 1.1%
Happy 1.1%
Surprised 0.3%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 98%
Sad 0.6%
Surprised 0.3%
Fear 0.3%
Angry 0.3%
Happy 0.2%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 20-28
Gender Female, 99.8%
Calm 50.3%
Happy 16.1%
Surprised 9.8%
Sad 8.1%
Confused 6.4%
Fear 6%
Angry 2%
Disgusted 1.3%

AWS Rekognition

Age 19-27
Gender Female, 100%
Surprised 43.5%
Happy 26.5%
Fear 12.7%
Confused 11.5%
Calm 1.9%
Angry 1.8%
Disgusted 1.3%
Sad 0.9%

AWS Rekognition

Age 43-51
Gender Female, 100%
Happy 78.6%
Calm 7.7%
Surprised 4.1%
Sad 3.2%
Angry 2%
Confused 1.6%
Fear 1.5%
Disgusted 1.4%

AWS Rekognition

Age 54-64
Gender Female, 100%
Sad 55.2%
Happy 20.5%
Confused 7.2%
Calm 6.1%
Surprised 4.1%
Angry 2.8%
Fear 2.2%
Disgusted 2%

AWS Rekognition

Age 27-37
Gender Female, 100%
Calm 76.8%
Fear 14%
Surprised 4.2%
Sad 1.4%
Confused 1.4%
Disgusted 0.8%
Angry 0.7%
Happy 0.5%

AWS Rekognition

Age 21-29
Gender Male, 89.8%
Angry 41.9%
Calm 29.5%
Sad 14.5%
Surprised 4.9%
Fear 4%
Confused 3.2%
Disgusted 1.7%
Happy 0.3%

AWS Rekognition

Age 35-43
Gender Female, 67.3%
Sad 45.9%
Calm 36.2%
Confused 11.5%
Angry 2.4%
Fear 1.4%
Disgusted 0.9%
Surprised 0.9%
Happy 0.7%

AWS Rekognition

Age 47-53
Gender Female, 100%
Sad 39.6%
Calm 18.3%
Fear 14.8%
Surprised 10.7%
Confused 4.9%
Angry 4.1%
Happy 3.9%
Disgusted 3.6%

AWS Rekognition

Age 23-31
Gender Male, 70.7%
Calm 60.4%
Surprised 14.9%
Fear 11.4%
Confused 4.3%
Sad 4.2%
Happy 2.3%
Angry 1.4%
Disgusted 1%

AWS Rekognition

Age 51-59
Gender Female, 61.1%
Calm 98.1%
Angry 0.4%
Disgusted 0.3%
Confused 0.3%
Surprised 0.3%
Sad 0.3%
Fear 0.2%
Happy 0%

AWS Rekognition

Age 25-35
Gender Male, 92%
Happy 97.6%
Angry 1%
Surprised 0.7%
Calm 0.3%
Sad 0.2%
Fear 0.1%
Disgusted 0.1%
Confused 0%

AWS Rekognition

Age 31-41
Gender Female, 87.4%
Calm 60.7%
Angry 26.5%
Surprised 4.8%
Sad 3.2%
Fear 1.6%
Happy 1.3%
Disgusted 1%
Confused 0.8%
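Each AWS Rekognition block above gives one detected face's emotion scores, which together sum to roughly 100%; the face's reported emotion is simply the highest-scoring entry. A small sketch of picking the dominant emotion, using the first face's values (Age 50-58) transcribed from the listing:

```python
# Emotion scores for the first detected face, from the listing above.
emotions = {
    "Calm": 73.9, "Happy": 11.5, "Sad": 9.5, "Fear": 1.9,
    "Surprised": 1.2, "Angry": 1.1, "Disgusted": 0.5, "Confused": 0.4,
}

# The dominant emotion is the key with the maximum score.
dominant = max(emotions, key=emotions.get)
print(dominant)
```

For faces where two emotions score closely (e.g. the Age 24-34 face, Sad 32.8% vs. Surprised 23.6%), the dominant label is far less reliable than for near-unanimous distributions like Calm 99.9%.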

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Ceiling Fan 88.3%

Captions

Microsoft

a vintage photo of a group of people in a room 95.1%
a vintage photo of a group of people standing in a room 93.1%
a vintage photo of some people in a room 93%