Human Generated Data

Title

Untitled (sitting around table on porch, Suffolk, Virginia)

Date

c. 1931, printed later

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.51

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Restaurant 99.8
Chair 99.6
Furniture 99.6
Chair 99.2
Person 99.2
Human 99.2
Person 99.1
Cafe 98.9
Person 95.5
Person 94.1
Person 93.6
Cafeteria 91.5
Person 90.6
Room 85.4
Indoors 85.4
Living Room 75.5
Wheel 75.2
Machine 75.2
Person 73.3
Person 72.6
Bicycle 70.4
Transportation 70.4
Vehicle 70.4
Bike 70.4
Dining Room 67.6
Chair 66.2
Person 64.7
Person 61.2
Housing 60.6
Building 60.6
Porch 58.7
Couch 58.6
Meal 58.1
Food 58.1
Sitting 57
Interior Design 57
Dining Table 55.2
Table 55.2
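
The Amazon labels above pair a tag name with a confidence score. As a minimal sketch, and not necessarily the pipeline used for this page, such labels could be requested from AWS Rekognition's DetectLabels operation via boto3; the file name and confidence floor below are placeholders.

    import boto3

    rekognition = boto3.client("rekognition")   # assumes AWS credentials are configured

    with open("photo.jpg", "rb") as f:          # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,                       # illustrative floor; the list above stops near 55
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')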

Clarifai
created on 2023-10-15

people 100
group together 98.9
furniture 98.6
group 98.2
adult 97.6
monochrome 97.4
man 97.2
child 97
chair 95.7
street 93.4
seat 93.1
home 93.1
recreation 92.3
woman 91.8
wear 91
many 90.9
several 89.6
boy 88.9
administration 88.6
five 88.5
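
The Clarifai concepts above are reported on a 0-100 scale. Below is a rough sketch of one way such concepts can be requested, assuming Clarifai's v2 REST prediction endpoint, an API-key authorization header, and the public general image-recognition model; all three are assumptions that vary by Clarifai API version, and the key and image URL are placeholders.

    import requests

    CLARIFAI_API_KEY = "..."                    # placeholder credential (assumed key-based auth)
    MODEL_ID = "general-image-recognition"      # assumed public model identifier

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai returns values in 0-1; scale to match the 0-100 list above
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')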

Imagga
created on 2021-12-14

patio 81.3
area 63.6
chair 59.1
table 57.1
structure 55.4
interior 52.2
room 52.2
furniture 47.4
house 41
home 39.1
floor 32.6
restaurant 32.2
modern 31.6
decor 31
window 29.8
living 28.5
wood 28.4
architecture 28.1
seat 27.6
design 26.5
building 24.3
luxury 24
sofa 23
chairs 21.6
inside 21.2
residential 21.1
cafeteria 20.7
lamp 20
indoor 18.3
comfortable 17.2
empty 17.2
estate 17.1
apartment 16.3
hotel 16.2
contemporary 16
decoration 15.9
residence 15.6
relaxation 15.1
indoors 14.9
light 14.7
door 14.7
couch 14.5
relax 14.3
real 14.2
style 14.1
glass 14
upscale 13.8
decorate 13.3
carpet 12.7
nobody 12.5
elegant 12
wall 12
furnishings 11.8
hardwood 11.8
lighting 11.6
dining 11.4
leather 11.4
fireplace 11.1
office 10.7
classroom 10.2
space 10.1
3d 10.1
rug 9.9
family 9.8
lifestyle 9.4
armchair 9.2
sliding door 9.2
plant 9
suburbs 8.9
sun 8.9
dwelling 8.8
wooden 8.8
urban 8.7
hall 8.6
sunny 8.6
food 8.5
travel 8.5
stone 8.4
dinner 8.4
outdoor 8.4
place 8.4
resort 8.3
ottoman 8.2
stylish 8.1
domestic 8.1
reflection 8.1
new 8.1
fixtures 7.9
suburban 7.9
terrace 7.9
lounge 7.8
rest 7.6
desk 7.6
elegance 7.6
anteroom 7.5
salon 7.4
bar 7.4
kitchen 7.4
vacation 7.4
warm 7.4
deck 7.3
summer 7.1
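
The Imagga tags above can be reproduced in outline with Imagga's /v2/tags endpoint, which uses HTTP Basic authentication with an API key and secret. This is a sketch rather than the page's actual pipeline; the credentials and image URL are placeholders.

    import requests

    IMAGGA_KEY, IMAGGA_SECRET = "...", "..."    # placeholder credentials

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},   # placeholder image URL
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    for item in response.json()["result"]["tags"]:
        # each entry carries a 0-100 confidence and a language-keyed tag name
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')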

Google
created on 2021-12-14

Furniture 93.3
Plant 93
Chair 86.6
Table 84.8
Tree 84.7
Black-and-white 84.6
Style 83.9
Line 81.7
Adaptation 79.3
Monochrome photography 77.8
Monochrome 77.8
Building 77.4
Tints and shades 76.7
Room 69.7
Houseplant 68.8
Window 66.2
Rectangle 65.9
Art 65.5
Stock photography 64.5
Event 64.3
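
The Google tags above correspond to label detection in the Cloud Vision API. A minimal sketch with the official Python client follows; the file name is a placeholder, and scores, returned in 0-1, are scaled to match the list.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()      # assumes application default credentials

    with open("photo.jpg", "rb") as f:          # placeholder file name
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")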

Microsoft
created on 2021-12-14

chair 97.2
table 96.1
window 96.1
indoor 95.9
black and white 93.6
text 93.5
floor 92.2
room 91
living 85.5
person 73.7
kitchen & dining room table 72.4
clothing 67.8
man 57.3
furniture 53.5
several 10.1
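
The Microsoft tags above match the output of Azure's Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, and this is not necessarily how the page's data was generated.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<resource>.cognitiveservices.azure.com/"   # placeholder endpoint
    KEY = "..."                                                    # placeholder key

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    result = client.tag_image("https://example.com/photo.jpg")     # placeholder image URL
    for tag in result.tags:
        # confidences are returned in 0-1; scale to match the 0-100 list above
        print(f"{tag.name} {tag.confidence * 100:.1f}")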

Face analysis

AWS Rekognition

Age 38-56
Gender Male, 91.9%
Calm 64%
Happy 21%
Sad 13.2%
Fear 0.5%
Confused 0.5%
Angry 0.4%
Surprised 0.3%
Disgusted 0.1%

AWS Rekognition

Age 47-65
Gender Male, 97.4%
Calm 92%
Sad 4.8%
Angry 0.9%
Confused 0.8%
Happy 0.7%
Surprised 0.5%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-37
Gender Male, 99.3%
Calm 90.9%
Sad 6.8%
Angry 0.8%
Confused 0.6%
Happy 0.3%
Surprised 0.3%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 29-45
Gender Male, 98.6%
Calm 72%
Sad 27%
Confused 0.4%
Angry 0.3%
Happy 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 34-50
Gender Female, 96%
Calm 71.4%
Happy 28.1%
Sad 0.2%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 36-52
Gender Male, 87.8%
Calm 54.7%
Sad 40.4%
Angry 2.5%
Happy 1.2%
Confused 0.4%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 25-39
Gender Male, 99.3%
Calm 42.7%
Sad 42.5%
Confused 7.6%
Angry 5.1%
Fear 1.1%
Disgusted 0.4%
Surprised 0.4%
Happy 0.3%

AWS Rekognition

Age 21-33
Gender Male, 62%
Calm 92.5%
Happy 5.7%
Sad 1.3%
Surprised 0.2%
Fear 0.1%
Angry 0.1%
Disgusted 0%
Confused 0%
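
The age ranges, gender estimates, and emotion scores in the AWS Rekognition blocks above are the fields returned by the DetectFaces operation when all attributes are requested. A minimal boto3 sketch follows; the file name is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")   # assumes AWS credentials are configured

    with open("photo.jpg", "rb") as f:          # placeholder file name
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')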

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
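
The Google Vision blocks above report likelihood buckets (Very unlikely through Very likely) rather than numeric scores, which is how Cloud Vision face detection expresses emotion, headwear, and blur. A minimal sketch with the official client; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()      # assumes application default credentials

    with open("photo.jpg", "rb") as f:          # placeholder file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # each attribute is a Likelihood enum such as VERY_UNLIKELY or UNLIKELY
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)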

Feature analysis

Amazon

Chair 99.6%
Person 99.2%
Wheel 75.2%
Bicycle 70.4%
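
The feature analysis entries above (Chair, Person, Wheel, Bicycle) are the Rekognition labels that also carry instance-level bounding boxes. A minimal sketch of reading those instances from the same DetectLabels response; the file name is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:          # placeholder file name
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        for instance in label["Instances"]:     # populated only for localizable labels
            box = instance["BoundingBox"]       # Left/Top/Width/Height as 0-1 ratios
            print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
                  f'at ({box["Left"]:.2f}, {box["Top"]:.2f})')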

Categories

Imagga

interior objects 98.3%

Text analysis

Amazon

83
63%
1931
Studio
Virginia
10
190
Hamblin
Hamblin Studio (W.E.A. Moore)
Moore)
(W.E.A.
Suffolk, Virginia
Suffolk,
с. 1931
CROP
с.
crop
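
The Amazon text results above mix full lines and individual words, which is how Rekognition's DetectText operation reports detections (each entry is typed LINE or WORD). A minimal boto3 sketch; the file name is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:          # placeholder file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # "Type" is either "LINE" or "WORD"
        print(detection["Type"], detection["DetectedText"],
              f'{detection["Confidence"]:.1f}')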

Google

83 63% Hamblin Studio (W.E.A. Moore) Suffolk, Virginia c. 1931
83
63%
Hamblin
Studio
(W.E.A.
Moore)
Suffolk,
Virginia
c.
1931
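
The Google text results above begin with one full string followed by individual tokens, matching the shape of Cloud Vision text detection, where the first annotation holds the complete detected text. A minimal sketch with the official client; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()      # assumes application default credentials

    with open("photo.jpg", "rb") as f:          # placeholder file name
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        # the first annotation is the full text; the rest are individual tokens
        print(annotation.description)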