Human Generated Data

Title

Untitled (woman on bed inside run-down house)

Date

1957

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16000.1

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Room 99.4
Indoors 99.4
Furniture 99.3
Person 98.9
Human 98.9
Bedroom 98.8
Interior Design 89.9
Dressing Room 87.2
Dorm Room 85.3
Painting 84.6
Art 84.6
Bed 83.6
Living Room 69.1
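
The Amazon tags above are Rekognition label-detection confidence scores. Below is a minimal sketch, purely illustrative, of how such labels can be requested with the AWS SDK for Python (boto3); the file name and confidence threshold are assumptions, not values taken from this record.

    # Hedged sketch: querying Amazon Rekognition for image labels with boto3.
    # "photo.jpg" and MinConfidence=60 are illustrative assumptions.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=60,
    )

    # Print "Name Confidence" pairs, mirroring the tag list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')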

Clarifai
created on 2023-10-29

people 99.9
two 98.8
adult 98.7
furniture 98.6
room 97.8
group 97.7
man 97.1
one 96.4
woman 94.7
three 93.5
sit 91.4
leader 91.2
home 89.8
wear 89.4
administration 89.3
print 88.5
child 88
chair 87.5
seat 87.5
group together 86.3

Imagga
created on 2022-02-05

sketch 100
drawing 99.5
representation 78.3
interior 30.1
room 28.2
home 26.3
house 25.1
snow 22
design 19.7
modern 19.6
furniture 18.1
glass 17.8
table 16.9
architecture 16.4
floor 14.9
style 14.8
window 14.8
wall 13.7
wood 13.3
urban 13.1
decor 12.4
black 12
city 11.6
decoration 11.6
residential 11.5
indoors 11.4
living 11.4
empty 11.2
elegant 11.1
inside 11
elegance 10.9
chair 10.6
comfortable 10.5
luxury 10.3
ice 10.1
indoor 10
light 10
seat 9.9
weather 9.7
pillow 9.7
sofa 9.6
apartment 9.6
building 9.5
travel 9.1
old 9.1
lamp 8.7
scene 8.7
cold 8.6
construction 8.6
relaxation 8.4
vintage 8.3
bedroom 8.2
comfort 7.7
bed 7.7
winter 7.7
drink 7.5
decorative 7.5
street 7.4
new 7.3
domestic 7.2
stylish 7.2

Google
created on 2022-02-05

Textile 87.5
Table 87
Dress 84.4
Style 83.8
Art 79.8
Font 75.7
Monochrome 73
Monochrome photography 70.9
Bed 70.6
Room 69.7
Vintage clothing 65.6
Comfort 63.5
Illustration 62.6
Visual arts 62.5
Fashion design 62.4
Linens 62.1
Bedding 60
Drawing 59.3
Painting 58.8
Sitting 55.4

Microsoft
created on 2022-02-05

text 99.3
indoor 97.5
drawing 96.9
wall 96.8
sketch 93.7
old 78.5
painting 68.3
white 63.8
art 54.1
room 54.1
vintage 37.7

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 69%
Calm 31.3%
Surprised 26.5%
Fear 18.8%
Sad 13.9%
Angry 3.4%
Disgusted 2.2%
Confused 2.2%
Happy 1.6%
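
The age range, gender estimate, and emotion percentages above come from face detection. The following is a hedged sketch of retrieving the same attributes with Rekognition's DetectFaces call; the file name is an assumption.

    # Hedged sketch: Amazon Rekognition face attributes via boto3.
    # "photo.jpg" is an illustrative assumption.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
        # Emotions come back with confidences; sort to match the list above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')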

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
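
The Google Vision rows above are likelihood ratings rather than numeric scores. A minimal sketch using the google-cloud-vision client library is shown below; the file name is an assumption.

    # Hedged sketch: Google Cloud Vision face-detection likelihoods.
    # "photo.jpg" is an illustrative assumption.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum (e.g. VERY_UNLIKELY), not a percentage.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)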

Feature analysis

Amazon

Person 98.9%
Painting 84.6%

Text analysis

Amazon

4
с
HOD
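
The short strings above ("4", "с", "HOD") are fragments that optical text detection pulled from the photograph. As a hedged illustration, a corresponding Rekognition DetectText call could look like the sketch below; the file name is an assumption.

    # Hedged sketch: Amazon Rekognition text detection via boto3.
    # "photo.jpg" is an illustrative assumption.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE detections correspond to the short strings listed above;
    # WORD detections are the individual tokens inside each line.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')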