Human Generated Data

Title

Untitled (auditorium seated with women at appliance promotional)

Date

1952-1957

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6377

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Audience 100
Human 100
Crowd 100
Person 98.7
Person 97.9
Person 95
Person 94.6
Person 85.1
Person 81.7
Person 74.7
People 72.7
Person 72.3
Speech 71.9
Indoors 70.6
Person 68.6
Photography 68
Face 68
Photo 68
Portrait 68
Lecture 59.7
Room 59
Classroom 59
School 59
Seminar 57
Interior Design 55.1
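
The Amazon tags above pair a label with a confidence score on a 0-100 scale. A minimal sketch of how comparable labels could be requested from AWS Rekognition with boto3 follows; the image file name, region, and thresholds are illustrative assumptions, not details from this record.

    import boto3

    # Hypothetical sketch: request labels for a local scan of the photograph.
    # Region, file name, and thresholds are assumptions for illustration.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("auditorium.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=30,
            MinConfidence=55,
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')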

Clarifai
created on 2019-03-22

crowd 99.8
many 99.8
people 99.7
audience 99.5
group together 99.4
group 98.4
man 96.5
spectator 95.2
adult 93.6
sports fan 88.9
woman 88.1
leader 87.5
recreation 86.8
meeting 86.7
chair 86.7
music 85.6
outlined 85.5
concert 83.8
sitting 83.5
war 83.1
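
Concepts like the Clarifai list above could come from its public general model via the v2 REST API. A minimal sketch follows; the API key, model ID, and image URL are placeholders, not values from this record.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
    MODEL_ID = "general-image-recognition"   # assumes the public general model

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/scan.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concept values are 0-1; scale to match the 0-100 scores shown above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')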

Imagga
created on 2019-03-22

spectator 40
classroom 33.4
room 31.6
group 20.1
crowd 18.2
city 16.6
people 16.2
travel 15.5
landscape 13.4
pattern 13
disco 12.1
texture 11.1
blackboard 10.8
water 10.7
surface 10.6
business 10.3
men 10.3
event 10.2
architecture 10.1
building 10.1
silhouette 9.9
modern 9.8
aerial 9.7
urban 9.6
person 9.6
design 9.6
hall 9.5
row 9.3
entrepreneur 9.2
walking 8.5
cityscape 8.5
male 8.5
bird 8.4
old 8.4
town 8.3
teamwork 8.3
art 8.3
stage 8.2
rough 8.2
success 8
women 7.9
standing 7.8
sea 7.8
rock 7.8
black 7.8
audience 7.8
student 7.6
ballroom 7.6
outdoors 7.5
street 7.4
new 7.3
structure 7.2
holiday 7.2
summer 7.1
sky 7
textured 7
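
Imagga exposes tagging through a REST endpoint secured with HTTP basic auth. A minimal sketch, assuming a hosted copy of the image; the key, secret, and URL are placeholders.

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/scan.jpg"},  # placeholder URL
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),        # placeholder credentials
    )
    resp.raise_for_status()

    # Imagga reports confidence on a 0-100 scale, as in the list above.
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')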

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

person 99.9
posing 86.8
people 81.3
white 74.7
group 74.4
black 71.6
crowd 69.3
player 64.5
team 28.5
commencement 28.5
event 2.9
research 2.3
music 2
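
Tags like Microsoft's could be produced by the Azure Computer Vision analyze endpoint. A minimal sketch follows; the resource endpoint, subscription key, and image URL are placeholders.

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/scan.jpg"},  # placeholder URL
    )
    resp.raise_for_status()

    # Confidence is 0-1; scale to match the 0-100 scores shown above.
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')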

Face analysis

Amazon

AWS Rekognition

Age 16-27
Gender Female, 50.3%
Surprised 49.5%
Happy 49.5%
Angry 49.6%
Confused 49.6%
Disgusted 49.9%
Sad 49.8%
Calm 49.6%
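
The age range, gender estimate, and near-uniform emotion scores above match the shape of AWS Rekognition's face-detection output. A minimal sketch with boto3; the image file and region are assumptions, not details from this record.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

    with open("auditorium.jpg", "rb") as f:  # hypothetical local scan
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        low, high = face["AgeRange"]["Low"], face["AgeRange"]["High"]
        print(f'Age {low}-{high}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')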

Feature analysis

Amazon

Person
Person 98.7%