Human Generated Data

Title

Untitled (older women in living room, with two black cats looking towards camera)

Date

1957

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18711

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Furniture 96.8
Chair 93.7
Clothing 90.3
Apparel 90.3
Human 88.7
Room 81
Indoors 81
Face 80.3
Person 75.6
Living Room 74.5
Flooring 71.6
Sitting 68.9
Shoe 66.3
Footwear 66.3
Portrait 66.3
Photography 66.3
Photo 66.3
Floor 63.5
Bedroom 63.2
Couch 58.1
Dressing Room 58
Bed 56.8
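
The Amazon tags above are label/confidence pairs from AWS Rekognition. As a rough illustration only, a boto3 request like the sketch below returns labels in that form; the file name, credential setup, and confidence threshold are assumptions, not part of this record.

```python
# Sketch: fetch image labels with AWS Rekognition (boto3).
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are already configured

# "photo.jpg" is a placeholder for a local scan of the photograph.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # illustrative threshold, roughly the lowest score listed above
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```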

Clarifai
created on 2023-10-22

people 99.9
music 96.8
furniture 96.8
adult 96.2
two 95.6
woman 95.3
seat 95.3
chair 94.2
piano 93.7
group 91.6
portrait 91.5
man 90.1
actress 90
home 89.2
wear 89.1
room 89
one 88.7
child 86.2
leader 85.7
musician 84.4

Imagga
created on 2022-02-25

chair 62.9
seat 44.9
room 26.1
furniture 26
interior 25.6
home 20.7
man 19.5
people 19
armchair 17.9
sitting 17.2
musical instrument 16.5
wheelchair 16
person 15.9
adult 15.7
indoors 14.9
table 13.9
inside 13.8
window 13.7
male 13.5
style 13.3
rocking chair 13.3
lifestyle 13
living 12.3
floor 12.1
modern 11.9
relax 11.8
barber chair 11.7
house 11.7
support 11.6
urban 11.4
old 11.1
stringed instrument 11.1
health 11.1
architecture 10.9
business 10.9
fashion 10.5
bowed stringed instrument 10.4
luxury 10.3
hospital 10.2
sofa 10.1
indoor 10
shop 9.9
portrait 9.7
lamp 9.5
work 9.5
men 9.4
travel 9.1
design 9
office 9
device 8.8
happy 8.8
women 8.7
vehicle 8.5
wood 8.3
city 8.3
building 8.3
street 8.3
furnishing 8.3
vintage 8.3
tourist 8.2
accordion 8.1
decor 8
working 7.9
leather 7.7
wall 7.7
keyboard instrument 7.7
reading 7.6
salon 7.5
senior 7.5
one 7.5
alone 7.3
computer 7.2
transportation 7.2
barbershop 7.1
patient 7.1
medical 7.1

Microsoft
created on 2022-02-25

indoor 99.2
chair 97.7
floor 96.7
room 91.2
text 84.2
black and white 83.9
table 81.9
living 78.8
furniture 24.3
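
The Microsoft tags appear to come from the Azure Computer Vision image-tagging endpoint. The sketch below is an assumption about how such tags can be requested with the azure-cognitiveservices-vision-computervision client; the endpoint, key, and file name are placeholders.

```python
# Sketch: request image tags from Azure Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("photo.jpg", "rb") as f:  # placeholder local scan
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # The service returns confidences in [0, 1]; the page above shows them as percentages.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```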

Face analysis

AWS Rekognition

Age 56-64
Gender Female, 69.3%
Confused 45%
Happy 20.7%
Calm 12.6%
Sad 8.6%
Disgusted 4.4%
Surprised 4.1%
Angry 3%
Fear 1.5%
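
The age range, gender, and emotion scores above are the kind of attributes AWS Rekognition returns from face detection. A minimal sketch, assuming boto3 and a placeholder file name:

```python
# Sketch: face attributes (age range, gender, emotions) via AWS Rekognition.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder local scan
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request emotions, age range, gender, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```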

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
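
Google Vision reports face attributes as likelihood buckets rather than percentages, which matches the "Very unlikely" values above. A minimal sketch using the google-cloud-vision client; the file name and credential setup are assumptions:

```python
# Sketch: face-detection likelihoods via the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("photo.jpg", "rb") as f:  # placeholder local scan
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```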

Feature analysis

Amazon

Person 75.6%
Shoe 66.3%

Text analysis

Amazon

-
3-D -
3-D
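
The strings under Amazon in this section look like raw OCR detections. A minimal sketch of the AWS Rekognition text-detection call that produces output in this form (boto3; placeholder file name):

```python
# Sketch: detect text in the photograph with AWS Rekognition.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder local scan
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Each detection is either a LINE or a WORD, with a confidence score.
    print(detection["DetectedText"], detection["Type"], f"{detection['Confidence']:.1f}")
```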

Google

3-Dmanaie
3-Dmanaie
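
The Google entries are likewise OCR output; Google Vision typically returns the full detected block first and then its parts, which is why the same string can appear twice. A minimal sketch with the google-cloud-vision client (placeholder file name):

```python
# Sketch: detect text in the photograph with the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder local scan
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # The first annotation is the full block; the rest are individual words.
    print(annotation.description)
```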