Human Generated Data

Title

Untitled (view of women sitting around fancy drawing room with large windows)

Date

1959

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9657

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.5
Person 99.5
Person 99.3
Person 99.1
Person 99
Person 98.6
Person 98.6
Room 97.4
Indoors 97.4
Interior Design 96.8
Person 94.6
Living Room 91.2
Furniture 80
Leisure Activities 69.9
Couch 68.5
Plant 65.4
People 63.1
Floor 59.2
Flower 58.9
Blossom 58.9
Lobby 58.3
Crowd 56.8
Chair 56.7
Waiting Room 56
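The Amazon list above is the flattened output of a label-detection call (presumably AWS Rekognition's DetectLabels, given the provider and score format). As a minimal sketch, a response in Rekognition's documented shape can be filtered by confidence like this; the sample entries are copied from the tag list above, and the threshold is an arbitrary assumption:

```python
# Filter a Rekognition-style DetectLabels response by confidence.
# The response shape mirrors Rekognition's documented output
# ({"Labels": [{"Name": ..., "Confidence": ...}, ...]}).

def labels_above(response, min_confidence=90.0):
    """Return (name, confidence) pairs at or above the threshold,
    highest confidence first."""
    hits = [(lbl["Name"], lbl["Confidence"])
            for lbl in response["Labels"]
            if lbl["Confidence"] >= min_confidence]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

# A few entries from the record above, in Rekognition's response shape.
sample = {"Labels": [
    {"Name": "Person", "Confidence": 99.5},
    {"Name": "Room", "Confidence": 97.4},
    {"Name": "Furniture", "Confidence": 80.0},
    {"Name": "Couch", "Confidence": 68.5},
]}

print(labels_above(sample))  # only Person and Room clear 90.0
```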

Imagga
created on 2022-01-23

salon 54
interior 50.4
shop 44.8
room 39.6
toyshop 39.4
table 37.4
home 32.7
mercantile establishment 31.1
house 30.9
furniture 26.8
chair 24.7
decor 23.9
decoration 21.7
window 21.6
indoors 21.1
place of business 20.9
luxury 20.6
modern 18.9
inside 18.4
indoor 18.2
design 15.7
elegant 15.4
comfortable 15.3
architecture 14.8
style 14.8
restaurant 14.6
apartment 13.4
people 13.4
glass 13.3
wood 12.5
dining 12.4
floor 12.1
living 11.4
light 11.4
kitchen 11.1
women 11.1
elegance 10.9
lamp 10.5
establishment 10.4
dinner 10.4
counter 10.3
lifestyle 10.1
fashion 9.8
family 9.8
sofa 9.6
person 9.6
party 9.5
wall 9.4
reception 8.8
kin 8.8
seat 8.6
sitting 8.6
mirror 8.6
old 8.4
classic 8.4
case 8.3
wine 8.3
vintage 8.3
antique 8.1
celebration 8
holiday 7.9
carpet 7.8
gift 7.7
residential 7.7
hotel 7.6
estate 7.6
drink 7.5
food 7.4
man 7.4
hospital 7.3
domestic 7.2

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

window 96.9
person 93.9
indoor 92.6
text 89.3
vase 86
table 83.4
furniture 57.5
candle 52.8
dinner 33.8
dining table 9.3

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 95.9%
Calm 95.4%
Surprised 2.5%
Fear 1.3%
Disgusted 0.3%
Sad 0.3%
Confused 0.2%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 48-56
Gender Male, 77.2%
Calm 99.8%
Happy 0.1%
Surprised 0%
Sad 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 87.2%
Calm 78.4%
Sad 16%
Confused 1.6%
Disgusted 1.1%
Surprised 1%
Happy 1%
Angry 0.5%
Fear 0.5%
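Each face block above is a compact rendering of one Rekognition FaceDetail record (age range, gender, and a ranked emotion list). A sketch of how such a record could be summarized into that form; the field names follow Rekognition's documented FaceDetail shape, and the sample values come from the first face block above:

```python
# Summarize a Rekognition-style FaceDetail record into the compact
# form shown above: age range, gender, and the dominant emotion.

def summarize_face(face):
    """Reduce a FaceDetail dict to age/gender/dominant-emotion strings."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    age = face["AgeRange"]
    return {
        "age": f'{age["Low"]}-{age["High"]}',
        "gender": f'{face["Gender"]["Value"]}, {face["Gender"]["Confidence"]}%',
        "emotion": f'{top["Type"].title()} {top["Confidence"]}%',
    }

# Values from the first AWS Rekognition face block above.
sample_face = {
    "AgeRange": {"Low": 29, "High": 39},
    "Gender": {"Value": "Male", "Confidence": 95.9},
    "Emotions": [
        {"Type": "CALM", "Confidence": 95.4},
        {"Type": "SURPRISED", "Confidence": 2.5},
        {"Type": "FEAR", "Confidence": 1.3},
    ],
}

print(summarize_face(sample_face))
```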

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
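Unlike Rekognition's percentages, Google Vision reports face attributes as bucketed likelihoods ("Very unlikely" through "Very likely"). A small helper, assuming the ordering of Vision's documented Likelihood enum, for testing whether a reported bucket meets a minimum likelihood:

```python
# Compare Google Vision likelihood buckets. The ordering follows the
# Vision API's Likelihood enum (UNKNOWN=0 .. VERY_LIKELY=5).

LIKELIHOOD_ORDER = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def at_least(reported, threshold):
    """True if the reported bucket is at or above the threshold bucket."""
    return (LIKELIHOOD_ORDER.index(reported)
            >= LIKELIHOOD_ORDER.index(threshold))

# Both faces above report "Very unlikely" for every attribute, so no
# attribute clears a POSSIBLE threshold.
print(at_least("VERY_UNLIKELY", "POSSIBLE"))  # False
```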

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people sitting at a table 96%
a group of people sitting around a table 95.9%
a group of people sitting at a table in front of a window 92.2%

Text analysis

Amazon

calee