Human Generated Data

Title

Indiana State Fair, Indianapolis

Date

1973

People

Artist: William Carter, American, born 1934

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.170

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 99.5
Person 98.8
Person 98.7
Person 96.6
Person 92.1
Person 82.5
Person 74.1
Leisure Activities 72.5
Musical Instrument 72.5
Guitar 72.5
Apparel 68.1
Clothing 68.1
Chess 66.2
Game 66.2
Shoe 65.6
Footwear 65.6
Shoe 65.2
Shorts 61.5
Pub 55.8

Imagga
created on 2021-12-14

person 28
people 27.3
adult 27
man 26.2
fashion 25.6
dress 24.4
clothing 24.1
happy 23.2
male 22.7
portrait 19.4
couple 19.2
boutique 18.5
outfit 18
lifestyle 16.6
wind instrument 16.6
life 16.5
casual 16.1
shop 16.1
happiness 15.7
attractive 15.4
love 15
black 14.6
smiling 14.5
style 14.1
musical instrument 13.9
brass 13.8
shopping 13.8
two 13.5
business 13.4
clothes 13.1
standing 13
looking 12.8
store 12.3
urban 12.2
pretty 11.9
women 11.9
elegance 11.7
model 11.7
city 11.6
handsome 11.6
smile 11.4
room 11.1
trombone 10.9
suit 10.9
garment 10.8
cheerful 10.6
lady 10.5
together 10.5
sexy 10.4
indoor 10
leisure 10
modern 9.8
interior 9.7
family 8.9
body 8.8
indoors 8.8
mall 8.8
bride 8.6
men 8.6
wall 8.5
retail 8.5
buy 8.4
fun 8.2
one 8.2
home 8
active 7.9
face 7.8
party 7.7
elegant 7.7
expression 7.7
performer 7.7
hand 7.6
bouquet 7.5
human 7.5
holding 7.4
event 7.4
wedding 7.4
professional 7.3
group 7.2
gorgeous 7.2
celebration 7.2
cute 7.2
hair 7.1
posing 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 99.1
clothing 98.2
footwear 89.1
man 82.1
text 80.3
woman 80
people 59
black and white 56.2
line 20
clothes 15
several 11.4

Face analysis

AWS Rekognition

Age 5-15
Gender Female, 94.8%
Sad 76.4%
Fear 17.1%
Calm 5.4%
Happy 0.3%
Angry 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 37-55
Gender Male, 61.1%
Calm 98.7%
Sad 1%
Happy 0.1%
Angry 0.1%
Surprised 0.1%
Confused 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 28-44
Gender Male, 77.5%
Sad 58.7%
Calm 39.4%
Confused 0.8%
Happy 0.7%
Angry 0.3%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 6-16
Gender Female, 89.6%
Calm 54.6%
Sad 32.2%
Fear 7.6%
Angry 3.4%
Happy 0.8%
Surprised 0.6%
Confused 0.4%
Disgusted 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Guitar 72.5%
Chess 66.2%
Shoe 65.6%

Captions

Microsoft

a group of people standing in front of a crowd 91%
a group of people standing in a room 90.9%
a group of people standing in front of a crowd of people 89.7%