Human Generated Data

Title

Untitled (people dancing at party)

Date

c. 1950

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19425

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99
Human 99
Shoe 98
Footwear 98
Clothing 98
Apparel 98
Person 94.3
Person 84.8
Person 74.4
People 70.2
Text 66.4
Shorts 62.4
Advertisement 59.2
Poster 58.5
Staircase 56.8
Handrail 56.2
Banister 56.2
Person 43.7
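
The "Amazon" tags above are name/confidence pairs of the kind returned by a label-detection service such as AWS Rekognition. A minimal sketch of retrieving comparable pairs is shown below; the filename, region, and thresholds are hypothetical placeholders rather than values taken from this record.

# Minimal sketch (assumed: AWS Rekognition label detection via boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_people_dancing.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=40,
)

# Print "Label confidence" pairs similar to the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")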

Clarifai
created on 2023-10-22

people 99.3
man 98.5
monochrome 96.9
adult 91.2
woman 91
music 89.8
street 89.3
indoors 87.8
two 86.3
wedding 85.4
portrait 84.9
wear 83.9
one 83.1
musician 78.6
group 77.4
dancing 76.5
many 76
side view 73.8
child 73.5
retro 71.9

Imagga
created on 2022-03-05

person 33.9
people 27.3
adult 25.3
dress 22.6
fashion 21.1
portrait 18.8
man 18.1
human 17.2
attractive 16.8
city 16.6
male 16.4
sexy 16.1
street 15.6
walking 15.2
women 15
happiness 14.9
posing 14.2
one 14.2
clothing 14.1
model 14
elegance 13.4
pretty 13.3
love 12.6
wall 12.3
crutch 12.3
urban 12.2
summer 12.2
couple 12.2
men 12
hair 11.9
style 11.9
casual 11.9
black 11.6
lifestyle 11.6
outdoor 11.5
smile 11.4
face 11.4
lady 11.4
groom 11
modern 10.5
walk 10.5
looking 10.4
life 10.2
cute 10
cool 9.8
body 9.6
bride 9.6
staff 9.6
standing 9.6
world 9.5
motion 9.4
happy 9.4
action 9.3
wedding 9.2
alone 9.1
sensuality 9.1
romantic 8.9
umbrella 8.7
day 8.6
fashionable 8.5
stand 8.5
youth 8.5
outdoors 8.4
stick 8.4
suit 8.3
adolescent 8.1
business 7.9
brunette 7.8
travel 7.7
elegant 7.7
outside 7.7
juvenile 7.7
hand 7.6
head 7.6
closeup 7.4
retro 7.4
girls 7.3
smiling 7.2
color 7.2
child 7.2
pedestrian 7.2
performer 7.2
businessman 7.1
look 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96.1
person 95.2
outdoor 94.5
black and white 86.1
clothing 73.2
wedding dress 65.7
woman 53.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 52-60
Gender Female, 83.6%
Calm 35.3%
Fear 28.9%
Surprised 13.6%
Angry 9.5%
Disgusted 4.4%
Happy 3.9%
Sad 2.2%
Confused 2.1%
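
The age range, gender, and emotion estimates above match the attributes exposed by AWS Rekognition's face detection. A minimal sketch, assuming the boto3 client and a hypothetical local filename:

# Minimal sketch: face attributes via AWS Rekognition detect_faces.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_people_dancing.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")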

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
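
The Google Vision values above are likelihood ratings (Very unlikely through Very likely) from its face detector. A minimal sketch, assuming the google-cloud-vision Python client and a hypothetical local filename:

# Minimal sketch: face likelihoods via Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_people_dancing.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Enum names such as VERY_UNLIKELY correspond to "Very unlikely" above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)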

Feature analysis

Amazon

Person
Shoe
Person 99%
Person 94.3%
Person 84.8%
Person 74.4%
Person 43.7%
Shoe 98%

Categories

Text analysis

Amazon

D
6
-
tirn
MAQOX
KOOKX A 2 - EELA tirn
2017 tEiA tirn
tEiA
KOOKX
2017
A 2
EELA
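
The strings above read as raw OCR detections from the photograph. A minimal sketch of how such detections can be produced, assuming AWS Rekognition text detection and a hypothetical filename:

# Minimal sketch: text detections via AWS Rekognition detect_text.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_people_dancing.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Detections come back at both LINE and WORD level, which would explain
# the mix of longer strings and single tokens in the list above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])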

Google

KODYK KODYK 2.r EEIA EIrn
KODYK
2.r
EEIA
EIrn
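
The Google text results above could similarly come from the Google Cloud Vision text detector; a minimal sketch, assuming the google-cloud-vision client and a hypothetical filename:

# Minimal sketch: OCR via Google Cloud Vision text_detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_people_dancing.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are
# individual tokens, matching the list above.
for annotation in response.text_annotations:
    print(annotation.description)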