Human Generated Data

Title

Untitled (man and woman in striped dress posed at chair in living room)

Date

1948-1950

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9220

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-01-23

Person 98.5
Human 98.5
Interior Design 98.4
Indoors 98.4
Person 98.3
Room 93.6
Furniture 84.7
Living Room 80.8
Bedroom 69.7
Clinic 62
Art 61.8
Drawing 59.4
Bathroom 58.1
Tie 57.7
Accessories 57.7
Accessory 57.7
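
The Amazon tags above match the output format of AWS Rekognition's DetectLabels operation (label name plus a confidence percentage). A minimal, hedged sketch of how such labels could be retrieved with boto3 follows; the filename, region, and thresholds are illustrative assumptions, not values from this record.

    # Hedged sketch: label detection with AWS Rekognition (boto3).
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("4.2002.9220.jpg", "rb") as f:   # illustrative filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # assumed cap
        MinConfidence=55.0,  # assumed threshold; the listed tags bottom out near 57.7
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")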

Clarifai
created on 2023-10-27

people 99.8
monochrome 98.9
adult 98.3
woman 98
child 96.6
group 95.2
man 94.8
wear 93.9
chair 93.2
two 92.5
group together 91.3
actress 91.1
furniture 90.9
musician 90.1
seat 88.9
music 88
administration 87.9
three 87
recreation 85.6
movie 85.6

Imagga
created on 2022-01-23

device 23
person 20.2
man 18.8
people 18.4
adult 18.3
exercise bike 17.2
lifestyle 15.9
male 15.6
house 14.2
exercise device 13.9
musical instrument 13.8
chair 13.6
interior 13.3
men 12.9
home 12.8
black 12.6
portrait 11.6
seat 11.4
indoors 11.4
room 11.3
bass 10.9
salon 10.7
happy 10.6
old 10.4
equipment 10.2
blackboard 10.2
sketch 9.6
women 9.5
sitting 9.4
youth 9.4
face 9.2
fashion 9
dress 9
work 9
drawing 9
one 9
style 8.9
crutch 8.8
smiling 8.7
grunge 8.5
travel 8.4
elegance 8.4
floor 8.4
holding 8.2
human 8.2
window 8.2
domestic 8.1
working 7.9
hair 7.9
day 7.8
art 7.8
staff 7.8
furniture 7.8
pretty 7.7
cleaner 7.6
studio 7.6
hand 7.6
building 7.5
rocking chair 7.5
leisure 7.5
clothing 7.4
cheerful 7.3
music 7.2
smile 7.1
happiness 7
architecture 7
modern 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 92.7
clothing 84.9
person 76.7
black and white 70.2

Color Analysis

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 84.3%
Calm 68.3%
Sad 20.4%
Happy 9.3%
Angry 0.4%
Fear 0.4%
Confused 0.4%
Surprised 0.4%
Disgusted 0.3%

AWS Rekognition

Age 48-54
Gender Female, 98.6%
Calm 93.5%
Fear 2%
Confused 1.4%
Surprised 1%
Happy 0.7%
Sad 0.6%
Angry 0.5%
Disgusted 0.3%
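
The age range, gender, and emotion scores above correspond to AWS Rekognition's DetectFaces operation with all facial attributes requested. A hedged boto3 sketch; the filename and region are assumptions.

    # Hedged sketch: per-face age range, gender, and emotion scores via AWS Rekognition.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("4.2002.9220.jpg", "rb") as f:   # illustrative filename
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")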

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
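
The Google Vision rows above are the bucketed face-annotation likelihoods (surprise, anger, sorrow, joy, headwear, blur) that the Cloud Vision API reports per detected face. A minimal sketch with the google-cloud-vision client library; the filename is an assumption.

    # Hedged sketch: face-likelihood annotations with the Google Cloud Vision API.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.9220.jpg", "rb") as f:   # illustrative filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood values are enum indices; map them to readable bucket names.
    likelihood_name = ("Unknown", "Very unlikely", "Unlikely",
                       "Possible", "Likely", "Very likely")

    for face in response.face_annotations:
        print("Surprise", likelihood_name[face.surprise_likelihood])
        print("Anger", likelihood_name[face.anger_likelihood])
        print("Sorrow", likelihood_name[face.sorrow_likelihood])
        print("Joy", likelihood_name[face.joy_likelihood])
        print("Headwear", likelihood_name[face.headwear_likelihood])
        print("Blurred", likelihood_name[face.blurred_likelihood])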

Feature analysis

Amazon

Person 98.5%
Tie 57.7%

Categories

Imagga

interior objects 94.8%
paintings art 4.5%

Captions

Microsoft
created on 2022-01-23

an old photo of a man 72.5%
a man standing in a room 72.4%
old photo of a man 68.4%
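
Captions with confidence scores like the ones above are the kind returned by Microsoft's Azure Computer Vision Describe operation. A hedged REST sketch follows; the API version (v3.1), endpoint, key, and filename are all assumptions rather than values from this record.

    # Hedged sketch: image captioning via the Azure Computer Vision Describe endpoint.
    import requests

    endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    key = "<subscription-key>"                                        # placeholder

    with open("4.2002.9220.jpg", "rb") as f:   # illustrative filename
        image_bytes = f.read()

    response = requests.post(
        f"{endpoint}/vision/v3.1/describe",
        params={"maxCandidates": "3"},
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")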

Text analysis

Amazon

YT37A2
M 11 YT37A2
M 11
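
Raw text detections of this kind (detected string with no further context) are what AWS Rekognition's DetectText operation returns. A hedged boto3 sketch; the filename and region are assumptions.

    # Hedged sketch: text detection with AWS Rekognition DetectText.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("4.2002.9220.jpg", "rb") as f:   # illustrative filename
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        # The response mixes LINE entries with the WORD entries they contain.
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))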