Human Generated Data

Title

Untitled (women selling food at outdoor stand)

Date

c. 1950

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19401

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 99.6
Person 99.4
Person 99.3
Person 98.4
Person 90.1
Tent 71.5
Clothing 69
Apparel 69
Food 68.6
Cafeteria 65.6
Restaurant 65.6
Meal 59.9
Shop 58.1
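
The Amazon tags above match the output shape of AWS Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming configured AWS credentials; the file name "photo.jpg" is a hypothetical stand-in for this photograph:

    # Sketch: Rekognition-style tags via boto3's detect_labels.
    # Assumes AWS credentials are configured; "photo.jpg" is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Prints "Label confidence" pairs like the list above (e.g. "Person 99.7").
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")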

Clarifai
created on 2023-10-22

people 99.9
group together 98.9
adult 98.5
man 98.4
many 98.3
group 97.8
monochrome 96.4
woman 94.9
vehicle 91.8
several 89.3
administration 86.5
street 86.1
recreation 85.3
tent 84.6
wear 84.4
child 84.3
military 78.7
furniture 76.3
transportation system 74.6
crowd 73.6
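
Clarifai tags of this kind typically come from its general image-recognition model. A rough sketch against Clarifai's v2 REST API; the endpoint path, model ID, and placeholder key are assumptions for illustration, not taken from this record:

    # Sketch: Clarifai-style tags over its v2 REST API.
    # Endpoint, model ID, and key handling are assumptions for illustration.
    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"        # hypothetical placeholder
    MODEL_ID = "general-image-recognition"   # assumed public model name

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )

    # Concepts carry a 0-1 value; scale to percent to match the list above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")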

Imagga
created on 2022-03-05

shop 35.4
marimba 35.1
percussion instrument 32
musical instrument 30.7
man 27.6
people 26.2
mercantile establishment 25.4
male 23.4
person 20.4
place of business 17
tobacco shop 16.7
men 15.4
adult 15.4
office 15.4
room 15.4
business 15.2
smiling 14.5
black 14.4
lifestyle 12.3
sitting 12
happy 11.9
businessman 11.5
work 11.1
barbershop 11.1
horizontal 10.9
classroom 10.7
hand 10.6
indoors 10.5
old 10.4
bakery 10.2
finance 10.1
back 10.1
human 9.7
to 9.7
professional 9.5
education 9.5
occupation 9.2
holding 9.1
portrait 9
computer 8.9
negative 8.7
film 8.7
counter 8.7
day 8.6
senior 8.4
establishment 8.4
blackboard 8.3
worker 8.3
school 8.2
table 8.2
looking 8
working 7.9
women 7.9
smile 7.8
restaurant 7.8
class 7.7
one 7.5
vintage 7.4
technology 7.4
symbol 7.4
camera 7.4
student 7.2
copy space 7.2
suit 7.2
team 7.2
handsome 7.1
financial 7.1
happiness 7
teacher 7
modern 7
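
Imagga's tagging endpoint returns a comparable tag/confidence list, including the WordNet-style parent terms seen above ("percussion instrument", "mercantile establishment"). A sketch against its v2 REST API, assuming basic-auth API credentials; the endpoint and response shape follow Imagga's public documentation as best recalled, so treat them as assumptions:

    # Sketch: Imagga-style tags via its REST tagging endpoint.
    # API key/secret and the image URL are hypothetical placeholders.
    import requests

    auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")
    image_url = "https://example.com/photo.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=auth,
    )

    # Imagga reports confidence on a 0-100 scale, like the list above.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")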

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.1
person 93.1
clothing 84.8
black and white 77.6
man 73.4
group 64.4
old 50
cooking 24.2
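
Microsoft tags like these correspond to the Azure Computer Vision "Analyze Image" call with tags requested. A sketch over the REST API; the resource endpoint, API version, and key are placeholders/assumptions:

    # Sketch: Microsoft-style tags via Azure Computer Vision "Analyze Image".
    # Endpoint host, API version, and key are hypothetical placeholders.
    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_KEY"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.com/photo.jpg"},
    )

    # Azure reports confidence as 0-1; scale to percent to match the list above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")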

Color Analysis

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 74.3%
Sad 78.6%
Happy 18.8%
Calm 1.1%
Surprised 0.6%
Confused 0.5%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 38-46
Gender Male, 96.6%
Calm 52.6%
Sad 34.3%
Confused 4.7%
Happy 3.2%
Disgusted 1.9%
Angry 1.7%
Surprised 0.8%
Fear 0.8%

AWS Rekognition

Age 33-41
Gender Male, 70.5%
Happy 81.7%
Calm 9.1%
Sad 2.8%
Confused 2.3%
Angry 1.2%
Surprised 1.1%
Fear 0.9%
Disgusted 0.9%

AWS Rekognition

Age 36-44
Gender Male, 99.4%
Sad 69.6%
Calm 18.8%
Happy 4.2%
Confused 4.2%
Angry 1.6%
Disgusted 0.7%
Surprised 0.5%
Fear 0.4%
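
Each AWS Rekognition block above (age range, gender with confidence, ranked emotions) mirrors the FaceDetails structure that DetectFaces returns when all attributes are requested. A minimal boto3 sketch, assuming configured credentials and a hypothetical file name:

    # Sketch: AWS Rekognition face attributes via detect_faces.
    # Assumes AWS credentials are configured; "photo.jpg" is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # One confidence per emotion type, ranked as in the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")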

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
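
The Google Vision face results use likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client library, again with a hypothetical file name:

    # Sketch: Google Cloud Vision face detection with likelihood buckets.
    # Assumes Google Cloud credentials; "photo.jpg" is hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each field is a Likelihood enum, e.g. VERY_UNLIKELY, rendered
        # above as "Very unlikely".
        print("Joy", face.joy_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Surprise", face.surprise_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)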

Feature analysis

Amazon

Person 99.7%
Person 99.6%
Person 99.4%
Person 99.3%
Person 98.4%
Person 90.1%
Tent 71.5%
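
The per-instance scores under Feature analysis correspond to the Instances array that DetectLabels attaches to localizable labels: one bounding box and confidence per detected person or tent. A short self-contained sketch, same assumptions as above:

    # Sketch: per-instance detections (bounding boxes) from detect_labels.
    # Assumes AWS credentials; "photo.jpg" is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]  # normalized 0-1 coordinates
            print(f"{label['Name']} {instance['Confidence']:.1f}% "
                  f"left={box['Left']:.2f} top={box['Top']:.2f}")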

Text analysis

Amazon

KODAK
SAFETY
PRODUCTO
EL PRODUCTO
EL
EILM
ROESSLER
CLUB
ROESLAR'S
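
The Amazon readings (including misreads such as "EILM" for the film-edge marking "FILM") are DetectedText values from Rekognition's DetectText operation. A minimal sketch, same assumptions as the earlier boto3 examples:

    # Sketch: OCR-style text detection with Rekognition's detect_text.
    # Assumes AWS credentials; "photo.jpg" is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections cover whole lines; WORD detections are single tokens.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])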

Google

..... KODAK SA EEIY E!M KODA K SA E ETY EILM
.....
KODAK
SA
EEIY
E!M
KODA
K
E
ETY
EILM
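
Google's block mirrors the shape of a Vision API text_detection response: the first annotation is the full detected text and the remaining entries are individual tokens, which is why the run-together line at the top reappears below it as fragments. A sketch with the client library, hypothetical file name as before:

    # Sketch: Google Cloud Vision OCR. The first annotation is the full
    # detected text; the rest are per-token detections, as listed above.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    annotations = response.text_annotations
    if annotations:
        print(annotations[0].description)   # full detected text
        for token in annotations[1:]:
            print(token.description)        # individual tokens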