Human Generated Data

Title

Untitled (woman in formal dress standing at end of long table in formal room)

Date

1929, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12972

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99
Human 99
Apparel 94.5
Clothing 94.5
Gown 87.2
Fashion 87.2
Evening Dress 87.2
Robe 87.2
Furniture 81.8
Female 78.2
Sleeve 77.6
Indoors 74.6
Glass 71.3
Room 70
Table 67.4
Photo 63.2
Photography 63.2
Face 63.2
Portrait 63.2
Woman 61.3
Corridor 58.6
Floor 58.1
Goblet 57.5
Flooring 56.7
Pub 55.5

Clarifai
created on 2019-11-16

people 100
adult 99.3
one 99.2
two 99.1
group 97
furniture 96.6
group together 96.3
man 96.3
woman 95.1
administration 94.5
leader 93
wear 92.6
three 91.3
four 89.7
seat 89.5
room 89.3
home 88.6
music 88
outfit 87.2
military 86.4

Imagga
created on 2019-11-16

bartender 22.4
man 17.5
old 16.7
case 14.9
people 14.5
building 14.1
counter 14
shop 13.4
interior 13.3
light 12.7
glass 11.6
window 11.5
furniture 11.1
architecture 11
chair 10.8
restaurant 10.2
indoor 10
dark 10
person 10
wood 10
stall 10
male 9.9
history 9.8
business 9.7
indoors 9.7
table 9.6
inside 9.2
city 9.1
hall 9
home 8.8
urban 8.7
women 8.7
water 8.7
men 8.6
adult 8.5
travel 8.4
design 8.4
black 8.4
modern 8.4
house 8.4
vintage 8.3
room 8
holiday 7.9
scene 7.8
color 7.8
mercantile establishment 7.7
structure 7.7
fashion 7.5
silhouette 7.4
style 7.4
historic 7.3
lifestyle 7.2
instrument 7.2
life 7.2
religion 7.2
wet 7.1
steel 7.1
work 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.7
person 95.3
clothing 94.5
indoor 93.5
black and white 92.2
white 69.2
monochrome 57.5
woman 53.5

Color Analysis

Face analysis

AWS Rekognition

Age 13-23
Gender Female, 54.4%
Happy 45%
Angry 45%
Disgusted 45%
Sad 45.1%
Surprised 45%
Calm 54.9%
Fear 45%
Confused 45%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories