Human Generated Data

Title

Untitled (black family looking on at casket)

Date

n.d.

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3569

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 99.9
Apparel 99.9
Person 98.1
Human 98.1
Person 96.4
Person 96
Plant 93.6
Overcoat 92.8
Coat 92.8
Accessories 90.4
Accessory 90.4
Tie 90.4
Flower 86.9
Blossom 86.9
Tie 85.5
Suit 82.8
Flower Arrangement 80.8
Flower Bouquet 76.1
Robe 69.3
Fashion 69.3
Furniture 67.6
Tuxedo 67
Gown 64.8
Funeral 56.5
Person 55.4

Imagga
created on 2022-01-08

man 40.3
person 36.2
male 32.6
business 27.9
people 27.9
office 25.8
businessman 24.7
sitting 21.5
table 20
work 19.9
adult 19
laptop 18.2
computer 17.8
room 17.3
professional 17
smiling 16.6
corporate 16.3
men 16.3
desk 16.2
worker 16.1
job 15.9
suit 15.9
smile 15.7
working 15
stretcher 14.6
happy 14.4
casual 14.4
lifestyle 13.7
group 12.9
black 12
teacher 11.9
technology 11.9
nurse 11.8
litter 11.6
indoors 11.4
meeting 11.3
team 10.7
to 10.6
chair 10.6
home 10.4
mature 10.2
communication 10.1
board 9.9
executive 9.9
handsome 9.8
cheerful 9.8
standing 9.6
education 9.5
color 9.5
coat 9.4
pen 9.4
paper 9.4
manager 9.3
coffee 9.3
life 9.1
portrait 9.1
classroom 9
conveyance 8.9
interior 8.8
looking 8.8
together 8.8
jacket 8.7
senior 8.4
hand 8.4
successful 8.2
indoor 8.2
success 8
women 7.9
love 7.9
day 7.8
couple 7.8
old 7.7
serious 7.6
two 7.6
keyboard 7.5
shirt 7.5
outdoors 7.5
holding 7.4
teamwork 7.4
phone 7.4
occupation 7.3
20s 7.3
patient 7.3
musical instrument 7.2
school 7.2
family 7.1
notebook 7.1
student 7.1
happiness 7
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 99.5
man 94.1
text 88.7
clothing 83.3
funeral 82.2
furniture 81
white 77.4
standing 76.1
posing 58.6
old 54.8
black and white 52.7

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 39-47
Gender Female, 100%
Sad 70.9%
Calm 24.7%
Fear 1.7%
Angry 0.8%
Confused 0.8%
Disgusted 0.7%
Happy 0.2%
Surprised 0.2%

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 88.2%
Sad 9.3%
Confused 1.3%
Angry 0.4%
Fear 0.3%
Surprised 0.2%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 49-57
Gender Male, 99.7%
Calm 97.4%
Surprised 0.5%
Angry 0.5%
Sad 0.5%
Disgusted 0.3%
Confused 0.3%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 42-50
Gender Male, 98.9%
Calm 74.4%
Sad 10.7%
Angry 9.3%
Disgusted 1.9%
Surprised 1.2%
Fear 1.1%
Confused 1%
Happy 0.4%

AWS Rekognition

Age 23-33
Gender Male, 99.5%
Calm 99.8%
Disgusted 0.1%
Sad 0.1%
Happy 0%
Confused 0%
Fear 0%
Angry 0%
Surprised 0%

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%
Tie 90.4%
Suit 82.8%

Captions

Microsoft

a person standing in front of a group of people posing for a photo 80.7%
a person standing in front of a group of people posing for the camera 80.6%
a person standing in front of a group of people posing for a picture 80.5%