Human Generated Data

Title

Untitled (woman in riding clothes seated with legs crossed)

Date

c. 1929

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12689

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 98.9
Human 98.9
Hat 89.6
Clothing 89.6
Apparel 89.6
Nature 75.9
Home Decor 73.9
Outdoors 72.1
Electronics 71.9
Screen 71.9
Sleeve 70.7
Monitor 60.3
Display 60.3
Bowl 58.3
LCD Screen 57.6
Photography 56.4
Photo 56.4
Indoors 55.8
Room 55.8

Imagga
created on 2022-02-04

person 37
man 28.2
male 27.7
people 25.6
adult 24.1
portrait 21.3
bride 19
love 18.1
television 18.1
dress 18.1
happiness 17.2
wedding 16.5
sitting 16.3
cheerful 16.2
clothing 15.3
business 15.2
businessman 15
groom 15
laptop 15
gown 14.6
happy 14.4
computer 14
professional 13.9
smiling 13.7
face 13.5
black 13.2
couple 13.1
lifestyle 13
handsome 12.5
hat 12.3
fashion 12.1
office 12
looking 12
one 11.9
telecommunication system 11.9
women 11.9
executive 11.8
suit 11.7
job 11.5
human 11.2
worker 11.2
modern 10.5
grand piano 10.5
men 10.3
hair 10.3
veil 9.8
indoors 9.7
table 9.6
married 9.6
bouquet 9.6
marriage 9.5
elegant 9.4
work 9.4
alone 9.1
indoor 9.1
fun 9
success 8.8
working 8.8
casual 8.5
senior 8.4
communication 8.4
mature 8.4
piano 8.4
studio 8.4
holding 8.2
monitor 8.2
outdoors 8.2
technology 8.2
celebration 8
holiday 7.9
dance 7.8
bridal 7.8
chair 7.7
attractive 7.7
uniform 7.7
old 7.7
husband 7.6
two 7.6
notebook 7.5
phone 7.4
percussion instrument 7.4
lady 7.3
screen 7.3
color 7.2
sexy 7.2
smile 7.1
day 7.1

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 99.6
wall 96.4
clothing 93.3
man 91.9
person 91.6
human face 90
black and white 84.7
hat 80.2
fashion accessory 72.9

Face analysis

AWS Rekognition

Age 18-24
Gender Male, 85.4%
Calm 94.1%
Surprised 3.8%
Happy 0.6%
Fear 0.4%
Angry 0.4%
Sad 0.3%
Disgusted 0.3%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Hat 89.6%

Captions

Microsoft

a photo of a man 88.6%
a man standing in front of a television 71.4%
a man standing in front of a monitor 71.3%

Text analysis

Amazon

lb
CIYA