Human Generated Data

Title

Untitled (interior view of crowded grocery store)

Date

1950-1953

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6319

Machine Generated Data

Tags (confidence scores in percent)

Amazon
created on 2022-01-22

Human 99.4
Person 99.4
Person 98.8
Person 96.1
Person 92.8
Person 92.1
Person 87.3
Person 86.4
Person 84.1
Person 78.6
Apparel 73.7
Clothing 73.7
Pedestrian 73
Person 72.8
Person 72.7
Person 71.6
Person 69.2
People 67.6
Crowd 56.5
Urban 56.4
Person 54.2
Person 51.6
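
Labels of this shape come from Amazon Rekognition's DetectLabels API. A minimal sketch follows, assuming a local copy of the image (the file path is hypothetical) and AWS credentials configured in the environment:

```python
# Minimal sketch: image labels via Amazon Rekognition's DetectLabels
# API through boto3. The file path is hypothetical; valid AWS
# credentials are assumed to be configured in the environment.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("grocery_store.jpg", "rb") as f:  # hypothetical local image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # the list above runs down to ~51%
    )

for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent,
    # matching rows like "Human 99.4" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```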

Clarifai
created on 2023-10-27

people 99.9
many 99.5
group 99.2
group together 98.1
street 96.6
crowd 96.5
woman 96.4
man 95.2
monochrome 94.8
adult 94.4
commerce 92.8
recreation 88.4
wear 85.8
child 85
administration 80.3
war 79.8
furniture 78.3
audience 78.3
market 77.5
spectator 77.4
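
Concepts of this shape match Clarifai's v2 predict endpoint. A sketch, with placeholder credentials and image URL; the model ID is an assumption, so check Clarifai's current documentation for the exact name:

```python
# Sketch: concept predictions from Clarifai's v2 REST API. API key,
# model ID, and image URL are placeholders, not values from the source.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-model ID
IMAGE_URL = "https://example.com/grocery_store.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concept values arrive in 0-1; scaling by 100 gives the
    # percentage-style scores listed above ("people 99.9", ...).
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```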

Imagga
created on 2022-01-22

man 22.2
person 20.6
people 20.1
black 18.7
dark 15.9
silhouette 15.7
male 15.6
night 15.1
men 14.6
sax 13.2
sexy 12.8
wind instrument 12.8
adult 12.7
world 12.1
light 12
women 11.9
business 10.9
music 10.9
equipment 10.8
bass 10.7
stage 10.7
crowd 10.6
portrait 10.3
smoke 10.2
spectator 10.2
symbol 10.1
musical instrument 10.1
brass 10.1
team 9.8
fashion 9.8
group 9.7
style 9.6
art 9.5
club 9.4
work 9.4
power 9.2
dance 9.2
city 9.1
photographer 9.1
concert 8.7
party 8.6
youth 8.5
lights 8.3
danger 8.2
industrial 8.2
interior 8
lifestyle 7.9
indoors 7.9
design 7.9
couple 7.8
nightlife 7.8
model 7.8
play 7.7
motion 7.7
old 7.7
hot 7.5
sign 7.5
one 7.5
disco 7.5
job 7.1
businessman 7.1
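
Imagga exposes tagging through its v2 /tags endpoint, authenticated with an API key/secret pair over HTTP Basic auth. A sketch with placeholder credentials and image URL:

```python
# Sketch: image tagging with Imagga's v2 /tags endpoint. Credentials
# and the image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.com/grocery_store.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    # Confidence is already a 0-100 score, as in "man 22.2" above.
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```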

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99
clothing 96.9
person 96.1
black and white 89.4
man 79.8
people 63.7
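
Microsoft's tags correspond to the Azure Computer Vision analyze endpoint. A sketch, assuming a provisioned Azure resource; the endpoint URL and subscription key are placeholders:

```python
# Sketch: tags from the Azure Computer Vision "analyze" endpoint
# (v3.2). The resource endpoint and key are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.com/grocery_store.jpg"             # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is 0-1; scaled by 100 it matches "text 99" etc.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```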

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 71.3%
Calm 99.2%
Sad 0.5%
Happy 0.1%
Confused 0.1%
Fear 0%
Angry 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 18-24
Gender Female, 80.8%
Sad 47.3%
Calm 37.2%
Confused 6.3%
Disgusted 3.2%
Fear 1.8%
Angry 1.5%
Happy 1.5%
Surprised 1.2%

AWS Rekognition

Age 29-39
Gender Female, 63.5%
Calm 75.8%
Happy 9.6%
Angry 4.1%
Surprised 3%
Sad 2.8%
Disgusted 2.1%
Confused 1.4%
Fear 1.2%

AWS Rekognition

Age 31-41
Gender Male, 99%
Calm 91.5%
Sad 4.6%
Happy 1.5%
Angry 0.8%
Surprised 0.6%
Disgusted 0.6%
Confused 0.2%
Fear 0.2%
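
Per-face age, gender, and emotion estimates of this shape come from Rekognition's DetectFaces API when all attributes are requested. A minimal sketch under the same assumptions as above (hypothetical file path, configured credentials):

```python
# Sketch: face attributes via Rekognition's DetectFaces API with
# Attributes=["ALL"]. File path and AWS credentials are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("grocery_store.jpg", "rb") as f:  # hypothetical local image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sorting by confidence reproduces the
    # descending lists above ("Calm 99.2% / Sad 0.5% / ...").
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```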

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
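
Google Vision reports face attributes as likelihood enums (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why these rows read "Very unlikely". A sketch, assuming configured application credentials and a hypothetical file path:

```python
# Sketch: face detection with the google-cloud-vision Python client.
# Application credentials are assumed; the file path is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("grocery_store.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood fields are enums; .name yields e.g. "VERY_UNLIKELY".
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```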

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

2
CLOSED
OPS
CASH
Vap
MILK
NOT
MIL
E
CASH EASIS
EV
E-Vap
BEGINNING
4
EASIS
FINEAPPLE NOT
EVA
E-V
.
BEGINNING JULY
MUS
E.V.
E:V
FINEAPPLE
MILI
JULY
it
MILA
LM
Arnatio
tration
LAN
LAN PINE
PINE
Create
arnatio
44769
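
Fragments like these are typical of Rekognition's DetectText API, which returns both LINE and WORD detections with confidence scores; partial and misread words (for example "FINEAPPLE") are reported verbatim. A minimal sketch (hypothetical file path, configured credentials):

```python
# Sketch: OCR via Rekognition's DetectText API. File path and AWS
# credentials are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("grocery_store.jpg", "rb") as f:  # hypothetical local image
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":  # skip LINE aggregates
        print(detection["DetectedText"])
```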

Google

OPS 2 PINAPE BEGNNING JLYE CASH BASIS CLOSED
OPS
PINAPE
JLYE
2
BEGNNING
CASH
BASIS
CLOSED
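
Google Vision OCR returns the full recovered text block first, followed by individual tokens, which matches the list above. A sketch using the Python client's text_detection method (credentials assumed, file path hypothetical):

```python
# Sketch: OCR with the google-cloud-vision Python client. The first
# annotation is the full text block; subsequent ones are tokens.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("grocery_store.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```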