Human Generated Data

Title

Untitled (debutantes)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19219

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.2
Human 98.2
Person 96.9
Person 96.7
Person 96.3
Person 94
Person 91.5
Person 91
Person 89.8
Person 89.7
Apparel 89.3
Clothing 89.3
Person 87.7
Person 85.9
Person 84
Person 73.5
Overcoat 65.7
Coat 65.7
Female 64.6
Indoors 59.5
Suit 57.8
Room 57.8
Crowd 57.1
Gate 56.7
Photo 56.5
Photography 56.5
Furniture 55
Chair 55

Imagga
created on 2022-03-05

business 38.3
people 32.3
gate 31.3
office 28
passenger 27.9
turnstile 23.7
man 23.5
men 23.2
building 23.2
corporate 22.3
businessman 21.2
women 20.6
urban 20.1
adult 19.6
window 19.5
group 19.3
hall 18.7
city 18.3
modern 18.2
travel 16.9
interior 16.8
businesswoman 16.4
male 16.3
counter 16.2
movable barrier 15.3
team 15.2
work 14.9
silhouette 14.9
architecture 14.7
professional 14.3
transportation 14.3
meeting 14.1
life 14
airport 13.9
suit 13.6
black 13.2
indoors 13.2
shop 13
teamwork 13
success 12.9
person 12.7
happy 12.5
executive 12.2
job 11.5
working 11.5
room 11.3
sitting 11.2
inside 11
indoor 11
chair 10.8
barrier 10.4
manager 10.2
glass 10.1
departure 9.8
corridor 9.8
walk 9.5
table 9.5
walking 9.5
career 9.5
journey 9.4
light 9.4
barbershop 9.3
floor 9.3
worker 9
clothing 9
mall 8.8
lifestyle 8.7
education 8.7
scene 8.7
crowd 8.6
boss 8.6
talking 8.6
businesspeople 8.5
trip 8.5
communication 8.4
attractive 8.4
fashion 8.3
successful 8.2
transport 8.2
smiling 8
subway 7.9
luggage 7.9
smile 7.8
conference 7.8
colleagues 7.8
station 7.7
check 7.7
two 7.6
ethnic 7.6
finance 7.6
company 7.4
tourism 7.4
vacation 7.4
reflection 7.3
portrait 7.1
day 7.1
mercantile establishment 7
together 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 96.6
clothing 89.7
black and white 80.1
woman 59.7
text 59.1

Face analysis


AWS Rekognition

Age 36-44
Gender Male, 90.5%
Sad 82.6%
Happy 14.5%
Calm 1.2%
Surprised 0.6%
Disgusted 0.6%
Confused 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 99.2%
Sad 0.3%
Happy 0.2%
Disgusted 0.1%
Confused 0.1%
Surprised 0.1%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 36-44
Gender Male, 98.8%
Sad 80.3%
Disgusted 5.9%
Confused 5%
Angry 2.7%
Calm 1.9%
Fear 1.7%
Surprised 1.3%
Happy 1.1%

AWS Rekognition

Age 27-37
Gender Female, 64.6%
Calm 97.2%
Happy 0.8%
Sad 0.8%
Disgusted 0.4%
Surprised 0.4%
Confused 0.2%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 29-39
Gender Female, 93.5%
Calm 23.8%
Sad 23.2%
Confused 23%
Fear 10.5%
Happy 6.4%
Surprised 5.4%
Disgusted 5.4%
Angry 2.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 98.2%

Captions

Microsoft

a group of people standing in front of a window 85.1%
a group of people standing in front of a building 85%
a group of people posing for a photo 82.8%

Text analysis

Amazon

9 P.M
в
GUERTS
CHACK GUERTS
CHACK
MAGOM
LaSalett
MJ17 YT37A*2 MAGOM
MJ17 YT37A*2

Google

SPM STS MJI7 YT37 A2 MAGOX
YT37
A2
SPM
MJI7
MAGOX
STS