Human Generated Data

Title

Untitled (Dorothy Spivack, Hong Kong)

Date

March 3, 1960-March 13, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5406

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Body Part 100
Finger 100
Hand 100
Person 99.8
Adult 99.8
Female 99.8
Woman 99.8
Accessories 91.6
Glasses 91.6
Neck 64.1
Shoulder 56.7
Blouse 55.1
Clothing 55.1

Clarifai
created on 2018-05-10

people 99.6
adult 99
one 98.8
portrait 98.6
man 95
wear 88.9
monochrome 88.3
facial expression 88
indoors 87.6
profile 86.3
woman 86.1
side view 85.8
sit 81.5
window 81.5
sadness 81
administration 78.8
music 76.1
actor 75.8
elderly 75.2
leader 71.3

Imagga
created on 2023-10-06

person 34.5
man 30.9
male 29.7
portrait 27.2
people 26.8
adult 26.6
face 21.3
happy 20.1
smile 18.5
attractive 18.2
smiling 18.1
looking 17.6
hair 17.5
phone 16.6
one 16.4
child 14.9
businessman 13.3
love 12.6
serious 12.4
call 12.3
office 12.3
business 12.2
women 11.9
grandma 11.7
handsome 11.6
hand 11.4
lady 11.4
groom 11.3
sexy 11.3
shirt 11.2
husband 10.7
holding 10.7
human 10.5
brunette 10.5
couple 10.5
sitting 10.3
close 10.3
casual 10.2
cute 10.1
pretty 9.8
black 9.6
talk 9.6
world 9.6
work 9.5
mobile 9.4
lifestyle 9.4
mature 9.3
telephone 9.2
girls 9.1
fashion 9.1
director 8.9
working 8.8
depression 8.8
happiness 8.6
men 8.6
talking 8.6
expression 8.5
youth 8.5
head 8.4
outdoors 8.2
relaxing 8.2
cheerful 8.1
home 8
suit 7.8
mother 7.8
married 7.7
old 7.7
thinking 7.6
communication 7.6
senior 7.5
hold 7.4
student 7.3
computer 7.2
look 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.9
wall 98.2
man 97.5
indoor 95.3
window 87.1

Color Analysis

Face analysis

AWS Rekognition

Age 48-56
Gender Female, 99.7%
Sad 47.4%
Confused 41.7%
Calm 17.7%
Surprised 8.1%
Fear 7.1%
Disgusted 4.1%
Angry 2.9%
Happy 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Adult 99.8%
Female 99.8%
Woman 99.8%
Glasses 91.6%

Categories