Human Generated Data

Title

Untitled (portrait of woman and girl with dog)

Date

c. 1950

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21814

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 98.8
Human 98.8
Clothing 97.4
Apparel 97.4
Person 96.3
People 85.4
Person 82.8
Female 77.1
Toy 68.6
Shoe 67.5
Footwear 67.5
Girl 64.2
Face 62.1
Woman 60.5
Doll 59.8
Long Sleeve 58
Sleeve 58
Chair 57.5
Furniture 57.5
Hair 55.4
Figurine 55.1
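
The Amazon tags above are the kind of output produced by Amazon Rekognition's label-detection endpoint. Below is a minimal sketch of reproducing such a list with boto3, assuming configured AWS credentials; the filename is a hypothetical local copy of the image:

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "portrait.jpg" is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the tag list above bottoms out around 55%
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

Paired entries such as Shoe/Footwear at identical scores likely appear because Rekognition returns each label together with its parent categories, and the page flattens both into one list.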

Clarifai
created on 2023-10-22

people 99.8
adult 98
group 97.8
woman 97.3
two 96.7
man 95.3
monochrome 93.6
wear 92.8
wedding 91.7
three 89.7
outfit 88.3
interaction 88.1
actor 87.6
actress 86.9
family 85.2
leader 84.5
child 84.4
veil 83.2
portrait 82.1
group together 82.1
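
The Clarifai concepts above can be reproduced against one of Clarifai's hosted models. A minimal sketch using the v2 REST API via requests; the access token, model ID, and image URL are hypothetical placeholders:

```python
# Minimal sketch: concept tagging with the Clarifai v2 REST API.
# PAT, model ID, and image URL are hypothetical placeholders.
import requests

PAT = "your_personal_access_token"
MODEL_ID = "general-image-recognition"  # assumption: Clarifai's general model
image_url = "https://example.org/portrait.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": image_url}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores concepts on a 0-1 scale; the list above uses percent.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```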

Imagga
created on 2022-03-11

person 37.8
people 25.1
man 22.9
adult 22.8
male 22.7
portrait 21.3
black 16.3
professional 16
planner 15.5
medical 15
happy 13.8
business 13.4
men 12.9
clothing 12.9
coat 12.8
automaton 12.8
attractive 12.6
health 12.5
businessman 12.4
standing 12.2
hair 11.9
nurse 11.5
lady 11.4
human 11.2
one 11.2
style 11.1
happiness 11
couple 10.4
looking 10.4
healthy 10.1
smile 10
dress 9.9
pretty 9.8
medicine 9.7
art 9.6
body 9.6
bride 9.6
home 9.6
work 9.5
smiling 9.4
doctor 9.4
lifestyle 9.4
camera 9.2
alone 9.1
bouquet 9.1
holding 9.1
fashion 9
brass 9
handsome 8.9
office 8.8
hospital 8.8
vertical 8.7
thinking 8.5
dark 8.3
occupation 8.2
pose 8.2
student 8.1
stylish 8.1
outfit 8.1
uniform 8.1
interior 8
worker 7.8
face 7.8
color 7.8
model 7.8
wind instrument 7.7
suit 7.7
care 7.4
friendly 7.3
room 7.3
confident 7.3
exercise 7.3
success 7.2
costume 7.2
job 7.1
indoors 7
life 7
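
The Imagga tags follow the same pattern via Imagga's REST tagging endpoint. A minimal sketch with requests; the API key, secret, and image URL are hypothetical placeholders:

```python
# Minimal sketch: tagging with the Imagga v2 REST API.
# Key, secret, and image URL are hypothetical placeholders.
import requests

IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
image_url = "https://example.org/portrait.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```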

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

wall 99.4
text 99.2
clothing 95.6
person 91.5
dress 89.9
human face 89.2
woman 63.8
smile 60.2
wedding dress 53.9
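
The Microsoft tags match the output shape of the Azure Computer Vision Analyze API. A minimal sketch against the v3.2 REST endpoint; the resource endpoint, key, and filename are hypothetical placeholders:

```python
# Minimal sketch: image tagging with Azure Computer Vision v3.2 Analyze.
# Endpoint, key, and filename are hypothetical placeholders.
import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"
KEY = "your_subscription_key"

with open("portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Azure scores tags on a 0-1 scale; the list above uses percent.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```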

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 77.4%
Happy 68.2%
Calm 28%
Sad 1.5%
Confused 0.6%
Disgusted 0.6%
Surprised 0.5%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 31-41
Gender Female, 98.2%
Calm 65.6%
Happy 28.1%
Sad 2%
Disgusted 1.1%
Surprised 1.1%
Fear 1%
Confused 0.7%
Angry 0.4%
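
The two AWS Rekognition blocks above, one per detected face, carry the age range, gender, and emotion estimates that Rekognition's face-detection endpoint returns when full attributes are requested. A minimal sketch with boto3; the filename is a hypothetical local copy of the image:

```python
# Minimal sketch: face analysis with Amazon Rekognition via boto3.
# Assumes AWS credentials; "portrait.jpg" is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```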

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
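
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages, which is how the Cloud Vision face-detection API scores emotions and image properties. A minimal sketch with the google-cloud-vision client; credentials and the filename are hypothetical:

```python
# Minimal sketch: face detection with Google Cloud Vision.
# Assumes Google Cloud credentials; "portrait.jpg" is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, not a numeric confidence.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```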

Feature analysis

Amazon

Person 98.8%
Person 96.3%
Person 82.8%
Shoe 67.5%

Categories

Imagga

paintings art 99.7%

Text analysis

Amazon

at

Google

ar MAOUN-YTERA2 -NAMT2AS
ar
MAOUN-YTERA2
-NAMT2AS
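
The text results come from each service's OCR endpoint; the reversed-looking Google string is simply what the model read off the photograph. A minimal sketch of the Google side with Cloud Vision text detection (Amazon's counterpart is Rekognition's DetectText); credentials and the filename are hypothetical:

```python
# Minimal sketch: OCR with Google Cloud Vision text detection.
# Assumes Google Cloud credentials; "portrait.jpg" is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected string; the remaining entries
# are individual tokens, which is why the list above repeats fragments.
for annotation in response.text_annotations:
    print(annotation.description)
```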