Human Generated Data

Title

Omar, West Virginia

Date

1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3085

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.9
Human 99.9
Person 99.8
Person 99.8
Person 99.7
Person 99.4
Person 99.3
Urban 93.8
People 93.5
Clothing 93.4
Apparel 93.4
Building 85.4
Shorts 73.1
Family 72.1
Outdoors 67.8
Nature 67.6
Skirt 57.3
Slum 55.2
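
These label/confidence pairs are typical of Amazon Rekognition's DetectLabels output. A minimal boto3 sketch of how tags like these could be produced; the S3 bucket, object key, and thresholds are placeholders, not details of the museum's actual pipeline:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical S3 location for the digitized photograph.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_omar_wv.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

# Print label names with confidence scores, matching the "Person 99.9" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```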

Clarifai
created on 2023-10-15

people 100
group 99.6
child 99.4
group together 99.4
three 99.2
portrait 97.8
four 97.3
man 97.2
adult 97
family 95.9
five 95.6
war 95.4
sibling 95.3
boy 94.9
woman 94.7
son 93.9
several 93.8
administration 93.4
two 91.7
documentary 91.4
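
The concept list above matches the shape of a Clarifai predict response. A sketch against the Clarifai v2 REST API, assuming its general recognition model; the API key, model ID, and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder
MODEL_ID = "general-image-recognition"      # placeholder model ID
IMAGE_URL = "https://example.org/shahn_omar_wv.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 value, shown above as percentages.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```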

Imagga
created on 2021-12-15

man 39
people 31.8
nurse 30.6
male 28.5
uniform 26.9
person 26.8
hospital 26.7
doctor 24.4
innocent 23.7
group 22.6
medical 22.1
team 21.5
happy 21.3
smile 18.5
medicine 17.6
child 17.1
professional 16.7
smiling 16.6
military uniform 16.5
standing 16.5
men 16.3
health 16
private 15.8
care 15.6
couple 14.8
business 14.6
teamwork 13.9
adult 13.9
clothing 13.8
outdoors 13.4
boy 13
portrait 12.9
student 12.9
stethoscope 12.8
women 12.6
staff 12.6
worker 12.5
together 12.3
outside 12
family 11.6
job 11.5
walk 11.4
walking 11.4
work 11
school 10.9
doctors 10.8
world 10.6
businessman 10.6
coat 10.4
occupation 10.1
kin 10.1
working 9.7
diversity 9.6
education 9.5
day 9.4
lifestyle 9.4
two 9.3
outdoor 9.2
confident 9.1
holding 9.1
old 9.1
black 9
handsome 8.9
mother 8.9
30s 8.7
confidence 8.6
mature 8.4
human 8.2
to 8
building 7.8
happiness 7.8
partnership 7.7
clinic 7.7
illness 7.6
healthy 7.6
friendship 7.5
teacher 7.4
camera 7.4
street 7.4
friendly 7.3
home 7.2
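
Tags in this form could come from Imagga's /v2/tags endpoint, which scores each English-language tag from 0 to 100. A sketch assuming that endpoint, with placeholder credentials and image URL:

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/shahn_omar_wv.jpg"

# Imagga uses HTTP basic auth with the key/secret pair.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry has an English tag and a 0-100 confidence, matching the "man 39" style above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```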

Google
created on 2021-12-15

Clothing 98.7
Dress 85.9
Building 80.1
Plant 79.6
Adaptation 79.4
People 78
House 76.9
Vintage clothing 76.4
Window 76.2
Toddler 71.5
Child 69.5
Monochrome 68.7
Room 68.6
History 67.1
Monochrome photography 62.7
Paper product 58.9
Uniform 56.8
Photo caption 56.1
Landscape 53.3
Retro style 52.6
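
Labels like these correspond to Google Cloud Vision label detection. A sketch using the google-cloud-vision client library; credentials are assumed to be configured, and the image URL is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/shahn_omar_wv.jpg")
)

response = client.label_detection(image=image)

# Scores are returned on a 0-1 scale; the listing above shows them as percentages.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```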

Microsoft
created on 2021-12-15

text 99.7
clothing 99.2
person 98.9
outdoor 95.6
grass 95.6
child 94.9
toddler 94.9
baby 94.8
smile 93.7
boy 93.4
standing 93.1
human face 92.1
posing 91.8
house 80.3
group 69.2
old 59
man 56.9
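
These tags match the output of Azure Computer Vision image tagging. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder Azure resource endpoint and key.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

result = client.tag_image("https://example.org/shahn_omar_wv.jpg")

# Tags carry a 0-1 confidence; the listing above shows them as percentages.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```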

Color Analysis

Face analysis

AWS Rekognition

Age 7-17
Gender Female, 96.1%
Calm 85.6%
Happy 10.9%
Sad 1.5%
Angry 0.9%
Disgusted 0.5%
Confused 0.2%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 20-32
Gender Female, 99.6%
Calm 93.7%
Happy 2.8%
Sad 2.3%
Confused 0.3%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Female, 95.7%
Calm 49.8%
Happy 36.4%
Angry 8.4%
Confused 1.5%
Surprised 1.4%
Sad 1.3%
Disgusted 0.9%
Fear 0.3%

AWS Rekognition

Age 13-23
Gender Female, 95%
Happy 65.5%
Confused 16%
Calm 14.5%
Surprised 2.8%
Sad 0.4%
Disgusted 0.3%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 4-12
Gender Male, 90.3%
Calm 95.2%
Sad 3.1%
Happy 0.9%
Angry 0.5%
Confused 0.1%
Surprised 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 20-32
Gender Female, 99.4%
Calm 62.2%
Sad 12.5%
Confused 7.5%
Angry 5.7%
Happy 3.8%
Surprised 3.7%
Fear 2.9%
Disgusted 1.7%
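
The age range, gender, and ranked emotion scores above are the standard fields of Amazon Rekognition's DetectFaces response when all attributes are requested. A boto3 sketch with a placeholder S3 location:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Attributes=["ALL"] requests age range, gender, emotions, and other face attributes.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_omar_wv.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are listed above in descending order of confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```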

Microsoft Cognitive Services

Age 13
Gender Female

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 11
Gender Female

Microsoft Cognitive Services

Age 7
Gender Female
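
Age and gender estimates like these were available from the Azure Face API when this data was generated; Microsoft has since retired those attributes. A sketch of the call as it worked then, with placeholder endpoint, key, and image URL:

```python
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder Azure resource endpoint and key.
face_client = FaceClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

faces = face_client.face.detect_with_url(
    url="https://example.org/shahn_omar_wv.jpg",
    return_face_attributes=["age", "gender"],
)

for face in faces:
    attrs = face.face_attributes
    # Gender may be returned as an enum; fall back to its string value for printing.
    gender = getattr(attrs.gender, "value", attrs.gender)
    print(f"Age {attrs.age:.0f}")
    print(f"Gender {str(gender).capitalize()}")
```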

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
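
Google Vision reports face attributes as likelihood ratings rather than numeric scores. A sketch using the same client library as above, printing likelihoods in the form shown in these listings:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/shahn_omar_wv.jpg")
)

response = client.face_detection(image=image)

for face in response.face_annotations:
    for label, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # Enum names like VERY_UNLIKELY render as "Very unlikely" in the listing above.
        print(f"{label} {value.name.replace('_', ' ').capitalize()}")
```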

Feature analysis

Amazon

Person 99.9%

Categories

Imagga

people portraits 97.4%
paintings art 1.8%