Human Generated Data

Title

Untitled (men around desk looking at oil filters)

Date

c. 1950

People

Artist: Lester Cole, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19636

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Tabletop 99.8
Furniture 99.8
Person 99.1
Human 99.1
Person 99
Table 98.8
Person 98.7
Person 97.3
Person 96.6
Desk 94.2
Clothing 93.9
Apparel 93.9
Dining Table 71.3
Chair 70.7
Indoors 69.5
Room 67
Tie 63.2
Accessories 63.2
Accessory 63.2
Sitting 61.6
Photography 60.7
Photo 60.7
Female 59.4
Machine 58.1
Pants 57.6
Sewing 55.3
Face 55.2

Clarifai
created on 2023-10-22

people 99.8
group 98.1
furniture 98
group together 97.9
adult 97.5
man 96.2
leader 95.9
administration 95.6
woman 92.8
room 92.3
several 90.8
two 90.6
recreation 89.3
war 89.2
four 88.7
child 88.5
chair 88.4
actor 87.4
home 86.6
three 85.5

Imagga
created on 2022-03-05

kitchen 45.8
stove 45.1
interior 36.2
gramophone 33.5
home 33.5
house 28.4
machine 26.9
record player 26.9
indoors 22.8
counter 21.7
man 20.8
device 20.7
modern 20.3
cooking 20.1
oven 19.7
room 19
person 18.7
food 18.2
cook 17.4
lifestyle 17.3
appliance 17
indoor 16.4
furniture 16.4
table 16.1
male 14.9
luxury 14.6
wood 14.2
people 13.9
men 13.7
adult 13.7
design 13.5
decor 13.2
chair 13.2
home appliance 13
smiling 13
waiter 12.9
inside 12.9
happy 12.5
sitting 12
dishwasher 11.8
dinner 11.8
happiness 11.7
sink 10.9
cabinet 10.8
steel 10.7
musical instrument 10.5
floor 10.2
architecture 10.1
glass 10.1
restaurant 10
white goods 10
equipment 9.9
holding 9.9
cheerful 9.7
bartender 9.6
work 9.5
fun 9
meal 8.9
style 8.9
granite 8.8
desk 8.8
women 8.7
standing 8.7
drinking 8.6
preparation 8.6
party 8.6
tile 8.5
smile 8.5
contemporary 8.5
domestic 8.4
new 8.1
dining-room attendant 8
office 8
job 8
decoration 8
breakfast 7.9
working 7.9
preparing 7.8
chef 7.8
stainless 7.7
employee 7.7
apartment 7.7
residential 7.7
commercial 7.5
light 7.3
occupation 7.3
business 7.3
color 7.2
celebration 7.2
worker 7.2
professional 7
life 7

Google
created on 2022-03-05

Black 89.8
Black-and-white 85.2
Style 83.9
Rectangle 76.1
Monochrome photography 75.2
Monochrome 75.1
Cooking 70.8
Font 70.4
Suit 70.3
Room 70.2
Table 70.1
Art 68.7
History 63.3
Stock photography 63.1
Vintage clothing 62.9
Sitting 61.7
Glass 60.2
Machine 57.4
Advertising 57.4
Visual arts 57.1

Microsoft
created on 2022-03-05

person 96.7
text 93.9
black and white 91.5
man 90.8
clothing 89.9
table 84.4
furniture 69.7

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 95.5%
Calm 99.9%
Sad 0.1%
Surprised 0%
Disgusted 0%
Fear 0%
Happy 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 29-39
Gender Male, 95.4%
Sad 50.4%
Calm 41.2%
Fear 3.9%
Confused 2.3%
Surprised 0.8%
Angry 0.7%
Happy 0.3%
Disgusted 0.3%

AWS Rekognition

Age 43-51
Gender Female, 51.7%
Sad 46.3%
Calm 38.1%
Happy 12.6%
Confused 0.9%
Disgusted 0.6%
Angry 0.6%
Fear 0.5%
Surprised 0.5%

AWS Rekognition

Age 52-60
Gender Male, 85.9%
Sad 90.4%
Calm 8.6%
Surprised 0.3%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%
Happy 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Male, 96.8%
Sad 49.8%
Calm 16.2%
Happy 13.2%
Angry 9.2%
Disgusted 3.8%
Confused 3.2%
Surprised 2.5%
Fear 2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Tie
Person 99.1%
Person 99%
Person 98.7%
Person 97.3%
Person 96.6%
Tie 63.2%

Text analysis

Amazon

B

Google

YT37A2-XAG
YT37A2-XAG