Human Generated Data

Title

Untitled (rephotographed early portrait of couple in living room)

Date

1935-1940, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10287

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Furniture 99.4
Chair 99.4
Person 98.9
Human 98.9
Person 98.7
Interior Design 80.2
Indoors 80.2
Floor 75.9
Flooring 74.1
Home Decor 72.5
Room 67.5
Living Room 67.5
Meal 63.4
Food 63.4
Art 62.5
Text 62.2
Door 61.2
Electronics 58.6
Screen 58.6
Advertisement 58.2
Restaurant 57.5
Cafe 57.5
LCD Screen 57.2
Display 57.2
Monitor 57.2
Collage 56
Poster 56
Window 55.8
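
The labels above are machine-vision output with confidence scores. A minimal sketch, assuming the tags came from AWS Rekognition's DetectLabels API called through boto3 (the page credits "Amazon" but does not document the pipeline); the file name and thresholds are placeholders:

    # Minimal sketch: label detection with AWS Rekognition via boto3.
    # "photo.jpg" is a placeholder; the museum's actual pipeline is unknown.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=30,        # roughly the length of the list above
        MinConfidence=55.0,  # the lowest score shown above is 55.8
    )

    # Emit "Label confidence" pairs in the same shape as the tag list.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")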

Clarifai
created on 2019-11-16

people 99.7
man 97
adult 97
room 96.7
group 96
furniture 93.8
woman 93.7
two 92.8
monochrome 92.1
wear 91.9
portrait 91.5
music 91.2
one 90.3
movie 89.8
television 89.3
military 88.6
chair 88.2
indoors 87.9
war 86.5
group together 86.2

Imagga
created on 2019-11-16

window 31.5
interior 30.1
case 27.5
building 25.6
room 23.9
architecture 23.5
house 23.4
home 22.3
chair 22
light 21.4
modern 18.9
indoors 17.6
furniture 17.5
indoor 16.4
city 15.8
inside 15.6
glass 15.6
office 14.1
door 14
shop 13.8
wall 13.7
business 13.4
urban 13.1
table 12.7
barbershop 12.6
design 12.4
man 12.1
old 11.8
restaurant 11.8
elegance 10.9
wood 10.8
black 10.8
musical instrument 10.7
lamp 10.6
ancient 9.5
people 9.5
luxury 9.4
decoration 9.4
structure 9.4
kitchen 9.3
silhouette 9.1
style 8.9
mercantile establishment 8.7
men 8.6
male 8.5
desk 8.4
street 8.3
travel 7.7
apartment 7.7
dining 7.6
television 7.6
equipment 7.5
floor 7.4
shadow 7.2
history 7.2
decor 7.1
working 7.1
computer 7
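
Imagga exposes its tagger as a REST endpoint rather than a client SDK. A minimal sketch of fetching a comparable tag list from the Imagga v2 tags endpoint with the requests library; the credentials and image URL are placeholders:

    # Minimal sketch: tagging via the Imagga v2 REST API.
    # Key, secret, and image URL are placeholders.
    import requests

    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # HTTP basic auth
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a confidence score, as above.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")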

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.1
furniture 95.4
person 91.7
black and white 91.1
indoor 89
clothing 87.8
table 82.8
black 81.5
man 80.6
white 79.3
chair 67.6
cluttered 12.1

Face analysis

AWS Rekognition

Age 12-22
Gender Female, 50.8%
Calm 54.3%
Sad 45.7%
Angry 45%
Disgusted 45%
Happy 45%
Surprised 45%
Fear 45%
Confused 45%

AWS Rekognition

Age 20-32
Gender Male, 54.9%
Disgusted 45%
Happy 45%
Angry 45.1%
Fear 45%
Sad 46.3%
Surprised 45%
Confused 45%
Calm 53.6%
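
The age ranges, gender calls, and per-emotion scores above match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, again assuming boto3 and a placeholder image file:

    # Minimal sketch: face attributes with AWS Rekognition DetectFaces.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:  # one confidence per emotion
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")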

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 22
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
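
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages, which is why the rows above carry no numbers. A minimal sketch using the google-cloud-vision client; the file name is a placeholder and credentials are assumed to be configured:

    # Minimal sketch: face likelihoods with Google Cloud Vision.
    from google.cloud import vision

    def bucket(value):
        # Render e.g. VERY_UNLIKELY as "Very unlikely", matching the rows above.
        return vision.Likelihood(value).name.replace("_", " ").capitalize()

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # placeholder file name
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    for face in response.face_annotations:
        print("Surprise", bucket(face.surprise_likelihood))
        print("Anger", bucket(face.anger_likelihood))
        print("Sorrow", bucket(face.sorrow_likelihood))
        print("Joy", bucket(face.joy_likelihood))
        print("Headwear", bucket(face.headwear_likelihood))
        print("Blurred", bucket(face.blurred_likelihood))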

Feature analysis

Amazon

Chair 99.4%
Person 98.9%

Text analysis

Amazon

24
Souis.
St.
Rina St. Souis.
Rina
TESOT

Google

St
Souis.
TLESDAY 24 Rino St Souis.
TLESDAY
24
Rino
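
The fragments above are the OCR engines' literal output, preserved as returned. Schweig worked in St. Louis, so "Souis." and "TLESDAY" are plausibly misreadings of handwritten "St. Louis" and "TUESDAY", though the record leaves them uncorrected. A minimal sketch of producing the Amazon list with Rekognition's DetectText API; the file name is a placeholder:

    # Minimal sketch: OCR with AWS Rekognition DetectText.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # Detections come back as both LINEs and WORDs, which is why
    # fragments repeat ("Souis." alone and inside a longer line).
    for det in response["TextDetections"]:
        print(det["Type"], det["DetectedText"])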