Human Generated Data

Title

Untitled (interior of corner of room containing chair and grandfather clock)

Date

1925-1940, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10299

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Chair 99.6
Furniture 99.6
Person 97.3
Human 97.3
Clothing 95.5
Apparel 95.5
Flooring 90
Floor 83.6
Overcoat 81
Suit 81
Coat 81
Indoors 67.7
Room 66.7
Living Room 65
Mannequin 61.2
Couch 58.9
Door 51.7

Clarifai
created on 2019-11-16

people 99.3
room 97.3
man 96.8
adult 96.4
furniture 94.7
indoors 94.6
woman 92.4
door 92.1
wear 91.7
window 90.9
doorway 90.8
group 90.1
no person 90
light 88.1
family 88
one 87.9
home 87.9
two 86.6
art 85.1
architecture 84.5

Imagga
created on 2019-11-16

window 31.8
barbershop 25.4
architecture 25.1
shop 22.7
building 22.7
interior 21.2
house 19.4
door 17.7
home 17.5
boutique 17.2
mercantile establishment 17.2
room 16.7
city 16.6
wall 16.5
old 16
light 14.7
inside 13.8
historic 13.8
antique 13.2
structure 12.6
history 12.5
glass 12.4
furniture 11.8
place of business 11.5
urban 11.4
historical 11.3
ancient 11.2
black 10.8
tourism 10.7
case 10.7
indoors 10.5
modern 10.5
art 10.4
design 10.1
lamp 10.1
indoor 10
travel 9.9
chair 9.5
vintage 9.1
people 8.9
office 8.7
windows 8.6
monument 8.4
town 8.3
silhouette 8.3
table 8.2
style 8.2
decoration 8
marble 7.7
construction 7.7
elegance 7.6
retro 7.4
street 7.4
color 7.2
religion 7.2
decor 7.1
framework 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 97.4
indoor 88.7
black and white 74.5
person 72.7
gallery 71.9
clothing 68.2
white 65.2
room 59.4

Color Analysis

Face analysis

AWS Rekognition

Age 15-27
Gender Female, 50.4%
Calm 54.7%
Fear 45%
Sad 45.1%
Happy 45.1%
Disgusted 45%
Surprised 45%
Confused 45%
Angry 45%

Microsoft Cognitive Services

Age 41
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.6%
Person 97.3%
Door 51.7%

Categories

Captions

Text analysis

Amazon

IB0E