Human Generated Data

Title

Untitled (four women, women’s club, Windham, NH)

Date

1955

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18572

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.3
Person 99.3
Person 98
Furniture 97.2
Person 97.2
Person 97.1
Chair 93.6
Table 91.7
Indoors 88.9
Room 88.9
Tabletop 80
Dining Table 73.8
Face 65.1
Musical Instrument 62.1
Leisure Activities 62.1
Piano 62.1
Female 61.6
Flooring 57.3
Floor 55.9
Desk 55.4
Apparel 55.1
Clothing 55.1
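The label/confidence pairs above are the shape of output AWS Rekognition's DetectLabels returns. As a minimal sketch of how such results are typically filtered, the snippet below builds an illustrative response dict from a few of the labels listed above (not an actual API call) and keeps only labels at or above a confidence threshold:

```python
# Illustrative Rekognition-style DetectLabels response, using a few of the
# labels and confidence scores listed above (sample data, not a live call).
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Furniture", "Confidence": 97.2},
        {"Name": "Piano", "Confidence": 62.1},
        {"Name": "Clothing", "Confidence": 55.1},
    ]
}

def labels_above(response, min_confidence):
    """Return label names whose confidence meets the threshold."""
    return [
        label["Name"]
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

print(labels_above(response, 90))  # ['Person', 'Furniture']
```

In practice the same effect is achieved server-side by passing a MinConfidence parameter to DetectLabels, which is why low-confidence tags like "Clothing 55.1" sit at the bottom of lists like this one.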

Imagga
created on 2022-03-05

brass 100
wind instrument 82.2
trombone 62.6
musical instrument 59.4
man 34.3
cornet 30.9
people 30.1
male 29.8
business 28.5
person 27.7
businessman 27.4
group 25
office 22.5
meeting 21.7
chair 20.8
men 18.9
team 18.8
table 17.3
laptop 17.3
work 17.3
corporate 17.2
professional 16.8
women 16.6
adult 16.5
sax 16.5
businesswoman 16.4
executive 15.8
communication 15.1
job 15
happy 15
room 14
teamwork 13.9
sitting 13.7
computer 13.6
device 13.6
confident 12.7
horn 12.5
worker 12.4
teacher 12
suit 11.7
smiling 11.6
black 11.4
together 11.4
couple 11.3
success 11.3
education 11.3
modern 11.2
manager 11.2
conference 10.7
handsome 10.7
working 10.6
businesspeople 10.4
smile 10
boss 9.6
desk 9.4
study 9.3
successful 9.1
silhouette 9.1
holding 9.1
board 9
cheerful 8.9
discussion 8.8
stage 8.7
employee 8.6
training 8.3
music 8.3
student 8.2
outfit 8.1
interior 8
chatting 7.8
students 7.8
employment 7.7
leader 7.7
collar 7.7
casual 7.6
workplace 7.6
two 7.6
talking 7.6
instrumentality 7.6
leisure 7.5
technology 7.4
musician 7.4
indoor 7.3
lady 7.3
girls 7.3
lifestyle 7.2
looking 7.2
body 7.2
singer 7.2
portrait 7.1
happiness 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

indoor 97.7
text 93.1
table 92
person 91.6
furniture 84.9
clothing 80.6
black and white 64.3

Face analysis

Amazon

Google

AWS Rekognition

Age 47-53
Gender Male, 89.5%
Happy 92.8%
Surprised 4.3%
Calm 0.9%
Sad 0.7%
Confused 0.5%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 49-57
Gender Female, 97.4%
Calm 51%
Happy 36.5%
Sad 7.7%
Confused 1.3%
Surprised 1.2%
Disgusted 1.1%
Fear 0.6%
Angry 0.5%

AWS Rekognition

Age 27-37
Gender Male, 99.9%
Surprised 51.5%
Calm 44.2%
Happy 3.2%
Disgusted 0.3%
Sad 0.2%
Confused 0.2%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 48-54
Gender Male, 99.5%
Happy 38.2%
Calm 24.2%
Sad 24%
Confused 10.5%
Disgusted 1.3%
Surprised 1%
Angry 0.6%
Fear 0.3%
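Each AWS Rekognition face record above ranks all eight emotions by confidence; the headline emotion is simply the highest-scoring entry. A small sketch, using an illustrative face dict modeled on the first result above (field names follow Rekognition's DetectFaces response shape):

```python
# Illustrative face record modeled on the first AWS Rekognition result above
# (sample data only; field names follow DetectFaces' response shape).
face = {
    "AgeRange": {"Low": 47, "High": 53},
    "Gender": {"Value": "Male", "Confidence": 89.5},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 92.8},
        {"Type": "SURPRISED", "Confidence": 4.3},
        {"Type": "CALM", "Confidence": 0.9},
    ],
}

def dominant_emotion(face):
    """Pick the emotion with the highest confidence score."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('HAPPY', 92.8)
```

Note how the second face above splits between Calm (51%) and Happy (36.5%): the dominant emotion is a maximum over a distribution, not a certainty.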

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
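Unlike Rekognition's percentages, Google Vision reports bucketed likelihoods ("Very unlikely", "Unlikely", and so on, per the Vision API's Likelihood enum). Comparing them requires an explicit ordering; a sketch, with an illustrative annotation dict based on the second face result above:

```python
# Google Vision face attributes use bucketed likelihoods rather than numeric
# scores; this ordering follows the Vision API's Likelihood enum.
LIKELIHOOD_ORDER = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def at_least(likelihood, threshold):
    """True if `likelihood` is at or above `threshold` in the enum order."""
    return LIKELIHOOD_ORDER.index(likelihood) >= LIKELIHOOD_ORDER.index(threshold)

# Illustrative annotation based on the second Google Vision result above.
annotation = {"joy": "VERY_UNLIKELY", "headwear": "UNLIKELY"}
print(at_least(annotation["headwear"], "UNLIKELY"))  # True
```

This is why the Google results above read uniformly as "Very unlikely" for Joy while Rekognition assigns Happy 92.8% to one face: the two services quantize confidence very differently.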

Feature analysis

Amazon

Person 99.3%
Piano 62.1%

Captions

Microsoft

a group of people in a room 92.8%
a group of people looking at a laptop 66.2%
a group of people sitting at a desk 66.1%

Text analysis

Amazon

KODAK-EIEW

Google

MJI7--YT 37A°2 -- XAGO
MJI7--YT
XAGO
--
37A°2