Human Generated Data

Title

Untitled (portrait of three women, Dallas, Texas)

Date

c. 1930s, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.41

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.8
Human 99.8
Person 99.5
Person 99.4
Face 98.4
Collage 90.6
Advertisement 90.6
Poster 90.6
Head 88.9
Monitor 88.4
Electronics 88.4
Screen 88.4
Display 88.4
Photo Booth 84.1
Person 68.6
Text 64.8
People 60.5
Portrait 60.2
Photography 60.2
Photo 60.2
Smile 59.5
Performer 57.6
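
The Amazon tags are label detections of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of such a call with boto3, assuming local image bytes; the file name is hypothetical:

import boto3

# Rekognition client; region and credentials come from the standard
# boto3 configuration (environment variables, ~/.aws/credentials, etc.).
client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("gittings_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,
)

# Each label carries a name and a 0-100 confidence score, which is
# the format of the "Person 99.8" entries listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')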

Clarifai
created on 2023-10-25

portrait 99.8
people 99.6
group 99.3
woman 98.7
adult 98
three 96.7
offspring 95.9
man 94.4
family 94.2
two 94.1
monochrome 93.3
four 91.7
son 91.5
girl 90.7
documentary 90.3
actress 89.7
child 89.4
collage 89.1
retro 87.9
sibling 87.7
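
Tags like the Clarifai list above come from Clarifai's general image-recognition model. A sketch using the clarifai-grpc client, assuming an app-scoped API key; the key, model id, and file name are all placeholders:

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_CLARIFAI_API_KEY"),)  # placeholder

with open("gittings_portrait.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

request = service_pb2.PostModelOutputsRequest(
    model_id="general-image-recognition",  # assumed public general model
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(base64=image_bytes)
            )
        )
    ],
)
response = stub.PostModelOutputs(request, metadata=metadata)

# Concepts are scored 0-1; the listing above scales them to
# percentages ("portrait 99.8").
for concept in response.outputs[0].data.concepts:
    print(f"{concept.name} {concept.value * 100:.1f}")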

Imagga
created on 2021-12-14

groom 100
people 36.3
happy 32
adult 31.7
person 28.6
smiling 28.2
couple 27.9
man 25.5
portrait 23.9
women 22.9
attractive 22.4
male 21.3
business 21.3
businessman 20.3
group 20.1
pretty 19.6
happiness 19.6
smile 19.2
love 18.9
office 18.5
suit 18
two 17.8
businesswoman 17.3
corporate 17.2
cheerful 17.1
fashion 16.6
professional 16
looking 16
holding 15.7
confident 15.5
sitting 15.5
men 15.5
businesspeople 15.2
face 14.9
together 14.9
bride 14.8
dress 14.5
success 13.7
laptop 13.7
executive 12.9
computer 12.8
mother 12.7
work 12.6
handsome 12.5
wedding 12
20s 11.9
communication 11.8
joy 11.7
lifestyle 11.6
working 11.5
cute 11.5
tie 11.4
lady 11.4
boy 11.3
elegance 10.9
family 10.7
job 10.6
collar 10.5
bouquet 10.4
teamwork 10.2
casual 10.2
model 10.1
successful 10.1
parent 10.1
team 9.9
black 9.7
partner 9.7
sexy 9.6
married 9.6
brunette 9.6
formal 9.5
hair 9.5
adults 9.5
meeting 9.4
smart 9.4
expression 9.4
relationship 9.4
positive 9.2
child 9.2
modern 9.1
worker 8.9
indoors 8.8
home 8.8
colleagues 8.7
husband 8.6
wife 8.5
career 8.5
females 8.5
studio 8.4
occupation 8.2
standing 7.8
kin 7.8
30s 7.7
partnership 7.7
marriage 7.6
desk 7.6
pair 7.6
togetherness 7.6
friends 7.5
human 7.5
fun 7.5
technology 7.4
style 7.4
teenager 7.3
sensuality 7.3
student 7.2
stylish 7.2
romance 7.1
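
Imagga exposes its tagger as a REST endpoint authenticated with an API key/secret pair. A sketch with the requests library; the credentials and image URL below are placeholders:

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credentials
IMAGGA_SECRET = "YOUR_API_SECRET"

# Placeholder URL standing in for the museum's image file.
image_url = "https://example.org/gittings_portrait.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Tags come back as {"confidence": ..., "tag": {"en": ...}} records,
# matching the "groom 100 / people 36.3" style list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')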

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

human face 99.4
text 98.5
monitor 98.4
smile 95.4
clothing 93.6
woman 90.8
picture frame 87.6
television 87.1
person 85.5
posing 82.3
screen 80.3
poster 50.1
screenshot 24.8
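
The Microsoft tags match Azure Computer Vision's image-tagging operation. A sketch with the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder Azure resource endpoint and key.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

# Placeholder URL standing in for the image file.
result = client.tag_image("https://example.org/gittings_portrait.jpg")

# Confidence is reported on a 0-1 scale; the listing above shows percentages.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")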

Face analysis

AWS Rekognition

Age 25-39
Gender Female, 98%
Calm 75.4%
Happy 20.8%
Sad 1.3%
Surprised 0.9%
Disgusted 0.4%
Fear 0.4%
Angry 0.4%
Confused 0.4%

AWS Rekognition

Age 11-21
Gender Female, 99.2%
Calm 94.1%
Happy 3.5%
Surprised 0.9%
Sad 0.7%
Angry 0.3%
Confused 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-35
Gender Female, 99.9%
Happy 98.4%
Surprised 0.8%
Calm 0.4%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Angry 0.1%
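
The age ranges, gender estimates, and emotion scores in these blocks are the fields Rekognition's DetectFaces operation returns when all facial attributes are requested. A sketch with boto3, again assuming a hypothetical local file:

import boto3

client = boto3.client("rekognition")

with open("gittings_portrait.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

# One FaceDetail per detected face; the record above shows three.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')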

Microsoft Cognitive Services

Age 39
Gender Female

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 15
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
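
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why these entries read "Very unlikely" through "Very likely". A sketch with the google-cloud-vision client; the file name is hypothetical:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gittings_portrait.jpg", "rb") as f:  # hypothetical file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enums run UNKNOWN, VERY_UNLIKELY, ... VERY_LIKELY.
for face in response.face_annotations:
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)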

Feature analysis

Amazon

Person 99.8%
Monitor 88.4%

Text analysis

Amazon

"
312A
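
The detected string comes from optical text detection of the kind Rekognition's DetectText operation performs. A sketch with boto3, with the same hypothetical file name:

import boto3

client = boto3.client("rekognition")

with open("gittings_portrait.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# DetectText returns both LINE and WORD detections; printing the
# lines reproduces entries like "312A".
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')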