Human Generated Data

Title

Elizabeth Catlett

Date

2003

People

Artist: Nancy Lee Katz, American, 1947–2018

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Michael S. Sachs, 2021.436

Copyright

© Michael S. Sachs

Machine Generated Data

Tags

Amazon
created on 2022-12-12

Portrait 100
Photography 100
Head 100
Face 100
Art 99.9
Painting 99.9
Person 99.6
Man 99.6
Adult 99.6
Male 99.6
Finger 98.5
Hand 98.5
Body Part 98.5
Glasses 96.9
Accessories 96.9
Person 91.6
Person 89.6
Woman 89.6
Adult 89.6
Female 89.6
Person 87.7
Adult 87.7
Man 87.7
Male 87.7
Drawing 75
T-Shirt 68.1
Clothing 68.1

Clarifai
created on 2023-10-13

people 99.9
adult 99.5
portrait 98.9
one 98.4
two 98.2
man 97.8
elderly 96.8
writer 96.4
monochrome 95.1
furniture 91.9
facial expression 90.8
music 90.8
woman 90.8
scientist 90.1
art 89.3
book series 88
administration 87.1
room 85.1
sit 84.4
concentration 82.3

Imagga
created on 2022-12-12

man 37
person 35
people 29.6
male 29.1
home 26.3
smiling 26
happy 24.4
adult 23.8
senior 23.4
office 22.5
grandma 22.4
work 20.4
indoors 18.5
lifestyle 18.1
computer 17.8
smile 17.1
business 17
attractive 16.8
sitting 16.3
portrait 16.2
looking 16
couple 15.7
table 15.6
laptop 15.4
desk 15.1
mature 14.9
expression 14.5
elderly 14.4
businessman 14.1
professional 13.6
kitchen 13.4
handsome 13.4
job 13.3
interior 13.3
working 13.3
cheerful 12.2
glass 12.1
glasses 12
device 12
businesswoman 11.8
family 11.6
holding 11.6
husband 11.5
face 11.4
education 11.3
corporate 11.2
old 11.1
worker 11.1
wine 10.8
cooking utensil 10.6
women 10.3
happiness 10.2
spatula 9.9
grandmother 9.8
disk jockey 9.7
appliance 9.7
food 9.7
waiter 9.7
technology 9.6
together 9.6
cooking 9.6
boy 9.6
wife 9.5
two 9.3
dinner 9.3
eating 9.3
restaurant 9.2
cup 9.2
house 9.2
bartender 9.1
room 9.1
pretty 9.1
machine 9
one 9
meal 8.9
employee 8.9
wineglass 8.8
child 8.7
concentration 8.7
retirement 8.6
men 8.6
communication 8.4
help 8.4
hand 8.4
drink 8.4
student 8.3
indoor 8.2
turner 7.9
look 7.9
lunch 7.8
hands 7.8
broadcaster 7.8
container 7.6
director 7.6
life 7.5
leisure 7.5
study 7.5
coffee 7.4
inside 7.4
successful 7.3
coat 7.3
confident 7.3
success 7.2
sewing machine 7.2
home appliance 7.1
hair 7.1
love 7.1
kitchen utensil 7

Google
created on 2022-12-12

Microsoft
created on 2022-12-12

man 99.1
person 98.8
wall 98.5
text 96.2
indoor 94.7
human face 94.2
black and white 93.5
book 91.3
clothing 80.6
glasses 52.9

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 72-82
Gender Male, 98.8%
Calm 98.5%
Surprised 6.5%
Fear 6%
Sad 2.2%
Disgusted 0.2%
Confused 0.2%
Angry 0.2%
Happy 0.1%

Microsoft Cognitive Services

Age 71
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Man 99.6%
Adult 99.6%
Male 99.6%
Glasses 96.9%
Woman 89.6%
Female 89.6%

Categories

Text analysis

Amazon

Figure
Form,
Gesture,

Google

UITI
Figure UITI
Figure