Human Generated Data

Title

Untitled (customers lined up to purchase meat, seen from behind shop counter)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14375

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 100
Apparel 100
Person 99.5
Human 99.5
Person 99.3
Person 99.1
Person 95.6
Person 95.3
Sunglasses 95.3
Accessories 95.3
Accessory 95.3
Sun Hat 93.3
Sunglasses 85.4
Sunglasses 79.2
Hat 74.2
Face 64.1
Cowboy Hat 59.4
Cap 57.7
Flower 57.7
Plant 57.7
Blossom 57.7
Bonnet 56.5
Hat 55.8
Person 47.6
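
These labels have the shape of Amazon Rekognition DetectLabels output: one label name per line with a confidence score on a 0-100 scale. A minimal sketch of how such tags could be generated with boto3; the image path and the confidence threshold are illustrative assumptions, not details of the pipeline actually used for this record:

```python
# Sketch: image labels via Amazon Rekognition DetectLabels (boto3).
# Assumes configured AWS credentials; "photo.jpg" is a placeholder path.
import boto3

def detect_labels(path: str, min_confidence: float = 45.0) -> list[tuple[str, float]]:
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,  # drop labels below this score
        )
    # Each label carries a Name and a Confidence score (0-100).
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

for name, confidence in detect_labels("photo.jpg"):
    print(f"{name} {confidence:.1f}")
```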

Clarifai
created on 2023-10-27

people 99.9
group 97.8
vehicle 96.1
woman 95.5
monochrome 95.5
man 95
child 94.9
adult 94.8
administration 94
leader 93.8
transportation system 93.4
group together 92.6
sit 88.1
war 88
many 86.6
car 86.5
several 84.2
convertible 84.2
three 82.6
four 82.6
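
Concept lists like this one are what Clarifai's model-prediction API returns. A hedged sketch against the public Clarifai v2 predict endpoint; the API key, model ID, and image URL are placeholders, and the model actually used for this record is not documented here:

```python
# Sketch: concept tags from the Clarifai v2 REST API (assumed endpoint shape).
# CLARIFAI_KEY, MODEL_ID, and the image URL are placeholder assumptions.
import requests

CLARIFAI_KEY = "YOUR_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed general-purpose model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts arrive with a name and a 0-1 value; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```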

Imagga
created on 2022-01-29

case 30.3
table 27.7
home appliance 24.3
glass 23.2
interior 22.1
appliance 21.8
dishwasher 17.6
salon 16.6
dinner 16.1
home 15.9
luxury 15.4
furniture 15.3
kitchen 15.3
room 14.7
party 14.6
setting 14.4
dining 14.3
decor 14.1
indoors 14
white goods 13.9
food 13.4
modern 13.3
wedding 12.9
napkin 12.7
restaurant 12.6
house 12.5
flowers 12.2
service 12
event 12
chair 11.9
health 11.8
banquet 11.8
clean 11.7
fork 11.5
toaster 11.3
people 11.1
plate 11
kitchen appliance 10.9
desk 10.9
drink 10.9
decoration 10.8
lifestyle 10.8
reception 10.8
knife 10.6
medicine 10.6
formal 10.5
celebration 10.4
indoor 10
shop 9.8
tablecloth 9.8
cooking 9.6
arrangement 9.6
design 9.6
fine 9.5
machine 9.4
bouquet 9.4
work 9.4
architecture 9.4
wine 9.2
elegance 9.2
silverware 9
wed 8.8
working 8.8
medical 8.8
catering 8.8
dine 8.8
cutlery 8.8
fancy 8.7
person 8.7
contemporary 8.5
flower 8.5
durables 8.4
place 8.4
glasses 8.3
office 8.2
device 8.1
man 8.1
day 7.8
face 7.8
lunch 7.7
elegant 7.7
cloth 7.7
healthy 7.6
technology 7.4
computer 7.2
domestic 7.2
black 7.2
romance 7.1
science 7.1
silver 7.1
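
Imagga's tags follow the same name-plus-score convention. A rough sketch against Imagga's /v2/tags endpoint, which takes HTTP basic auth; the credentials and image URL are placeholders:

```python
# Sketch: tags from the Imagga /v2/tags endpoint with HTTP basic auth.
# API_KEY, API_SECRET, and the image URL are placeholder assumptions.
import requests

API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs an English tag name with a 0-100 confidence score.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```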

Microsoft
created on 2022-01-29

text 99.3
person 93.6
black and white 70.3
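
These tags are consistent with Azure Computer Vision's image analysis. A hedged sketch using the v3.2 Analyze REST endpoint; the resource endpoint, key, and image URL are placeholders:

```python
# Sketch: tagging an image with the Azure Computer Vision v3.2 Analyze endpoint.
# ENDPOINT, KEY, and the image URL are placeholder assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

# Tags carry a name and a 0-1 confidence; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```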

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 97.5%
Sad 0.9%
Confused 0.5%
Happy 0.4%
Disgusted 0.2%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 18-26
Gender Male, 98.4%
Calm 98.7%
Sad 0.7%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%
Angry 0.1%

AWS Rekognition

Age 52-60
Gender Male, 100%
Calm 84%
Happy 4.8%
Surprised 4.4%
Angry 1.9%
Sad 1.8%
Disgusted 1.3%
Confused 1%
Fear 0.9%

AWS Rekognition

Age 39-47
Gender Male, 99.3%
Calm 95.6%
Sad 2.3%
Confused 1.7%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0.1%
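
The four blocks above mirror Rekognition DetectFaces output: one record per detected face, with an estimated age range, a gender estimate, and a score for each emotion. Note that the eight emotion scores in each block sum to roughly 100%, so they read as a distribution rather than independent detections. A minimal boto3 sketch; the image path is a placeholder:

```python
# Sketch: per-face age, gender, and emotion estimates via Rekognition DetectFaces.
# "photo.jpg" is a placeholder; Attributes=["ALL"] requests the full attribute set.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 33, "High": 41}
    gender = face["Gender"]   # {"Value": "Male"/"Female", "Confidence": 0-100}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion scores are not pre-sorted; order them highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```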

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
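
Google Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY up to VERY_LIKELY) rather than percentages, which is why the three blocks above read differently from the Rekognition ones. A sketch with the google-cloud-vision client; the image path is a placeholder and credentials are assumed to be configured:

```python
# Sketch: face likelihood buckets from the Google Cloud Vision API.
# "photo.jpg" is a placeholder path; requires the google-cloud-vision package.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```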

Feature analysis

Amazon

Person 99.5%
Person 99.3%
Person 99.1%
Person 95.6%
Person 95.3%
Person 47.6%
Sunglasses 95.3%
Sunglasses 85.4%
Sunglasses 79.2%
Hat 74.2%
Hat 55.8%

Categories

Imagga

interior objects 98.2%

Text analysis

Amazon

us
SALER
SOCTER
Sweet
Credit
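
Fragments like these are typical of OCR over signage in a cluttered scene ("SALER" and "SOCTER" look like partial reads of shop signs). Rekognition's DetectText returns each detected line and word with a confidence score; a minimal boto3 sketch, with "photo.jpg" as a placeholder path:

```python
# Sketch: extracting scene text with Amazon Rekognition DetectText.
# "photo.jpg" is a placeholder path; AWS credentials are assumed configured.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# DetectText returns both LINE and WORD detections; keep only the words here.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```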