Human Generated Data

Title

Untitled (Marked Tree, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1210

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.9
Shirt 99.9
Adult 99.7
Male 99.7
Man 99.7
Person 99.7
Accessories 99.6
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Photography 98.4
Face 98.1
Head 98.1
Portrait 98.1
Formal Wear 94.7
Tie 94.7
Wristwatch 87.1
Suspenders 76
Bag 65.2
Vest 65.1
Electrical Device 57.9
Microphone 57.9
Handbag 57.9
Coat 57.7
Body Part 57.4
Neck 57.4
Blouse 57.2
People 57
Hat 56.7
Lady 55.3
Jewelry 55.1
Necklace 55.1

Clarifai
created on 2018-05-11

people 99.9
portrait 98.3
adult 97.1
wear 96.3
man 96.2
one 96.2
administration 92.9
two 92.6
group 90.3
actor 90.2
uniform 89.9
outfit 89.3
music 89.1
facial expression 87.6
military 87.4
war 85.5
three 85
group together 84.8
leader 84.7
child 84.5

Imagga
created on 2023-10-06

nurse 56
person 38.5
doctor 36.7
man 36.3
male 35.6
medical 34.5
professional 29.8
hospital 29.6
people 27.9
adult 26.4
health 25.7
medicine 25.6
smile 25
coat 24.6
portrait 24
clinic 22.1
stethoscope 21.2
smiling 21
happy 20.1
men 19.8
lab coat 19.5
mature 18.6
attractive 18.2
uniform 17.9
face 17.8
care 17.3
senior 16.9
occupation 16.5
office 16.1
work 15.7
specialist 15.4
handsome 15.2
job 14.2
business 14
physician 13.7
confident 13.7
elderly 13.4
patient 13.4
working 13.3
standing 13.1
lifestyle 13
clothing 12.7
student 12.2
practitioner 12.1
looking 12
garment 12
doctors 11.8
surgeon 11.1
worker 11
waiter 10.8
lab 10.7
couple 10.5
tie 10.5
women 10.3
friendly 10.1
necktie 10
fashion 9.8
staff 9.8
human 9.8
middle aged 9.7
businessman 9.7
sexy 9.6
exam 9.6
serious 9.5
model 9.3
guy 9.3
casual 9.3
glasses 9.3
old 9.1
one 9
success 8.9
clinical 8.8
indoors 8.8
expertise 8.7
dining-room attendant 8.7
looking camera 8.7
profession 8.6
illness 8.6
career 8.5
black 8.4
studio 8.4
suit 8.2
lady 8.1
family 8
home 8
cute 7.9
together 7.9
intern 7.9
color 7.8
corporate 7.7
sitting 7.7
youth 7.7
shirt 7.5
holding 7.4
camera 7.4
treatment 7.4
employee 7.3
aged 7.2
blond 7.2
hair 7.1
love 7.1
modern 7
look 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
old 53.8

Face analysis


AWS Rekognition

Age 6-14
Gender Female, 80.4%
Calm 85.6%
Sad 8.4%
Fear 6.6%
Surprised 6.4%
Angry 0.7%
Disgusted 0.7%
Confused 0.5%
Happy 0.3%

AWS Rekognition

Age 23-31
Gender Male, 100%
Calm 98.3%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Confused 0.6%
Angry 0.2%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 37-45
Gender Male, 64.1%
Calm 97.5%
Surprised 6.3%
Fear 5.9%
Sad 2.6%
Angry 0.3%
Confused 0.2%
Disgusted 0.2%
Happy 0.1%

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 37
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.7%
Male 99.7%
Man 99.7%
Person 99.7%
Wristwatch 87.1%

Categories

Imagga

paintings art 58.4%
people portraits 41.3%

Text analysis

Amazon

FREE
somid