Human Generated Data

Title

Untitled (women at table at party)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19200

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Chair 99.3
Furniture 99.3
Restaurant 98.9
Person 98.3
Human 98.3
Person 96.3
Chair 95.8
Person 93
Cafeteria 86.4
Chair 81.5
Cafe 79.3
Food Court 78.7
Food 78.7
Sitting 78.3
Flooring 78.1
Meal 74.4
Person 72.4
Table 72.2
People 62.6
Dining Table 58.6

Clarifai
created on 2023-10-22

people 99.8
adult 98.3
group 98.2
woman 98
man 97.5
group together 94.9
furniture 93.2
retro 90.1
vehicle 89.7
transportation system 89.2
four 88.3
several 87.5
sit 85.9
seat 85.8
recreation 84.1
wear 84
three 83.7
portrait 82.9
child 81.4
vintage 80.3

Imagga
created on 2022-02-25

home 33.5
man 30.2
passenger 27.8
people 26.2
adult 24.2
person 23.8
happy 23.8
smiling 23.1
room 21.5
male 21.3
couple 20.9
wheeled vehicle 20.5
house 19.2
men 18
office 17.9
business 17.6
indoors 17.6
sitting 16.3
interior 15.9
smile 15.7
businessman 15
together 14.9
modern 14.7
lifestyle 14.4
portrait 14.2
women 14.2
vehicle 13.9
mobile home 13.6
sofa 13.5
family 13.3
chair 13.1
building 13.1
window 13
happiness 12.5
structure 12.2
work 11.8
cheerful 11.4
trailer 10.8
living room 10.8
transportation 10.7
professional 10.7
housing 10.7
couch 10.6
patient 10.2
two 10.2
communication 10.1
indoor 10
conveyance 9.9
living 9.5
enjoying 9.5
corporate 9.4
luxury 9.4
senior 9.4
child 9.3
back 9.2
travel 9.1
car 9.1
new 8.9
computer 8.9
working 8.8
looking 8.8
apartment 8.6
comfortable 8.6
casual 8.5
nurse 8.4
transport 8.2
team 8.1
love 7.9
standing 7.8
education 7.8
loving 7.6
talking 7.6
togetherness 7.5
life 7.5
fun 7.5
leisure 7.5
outdoors 7.5
mature 7.4
laptop 7.4
teamwork 7.4
care 7.4
occupation 7.3
children 7.3
school 7.2
worker 7.2
holiday 7.2
bright 7.1
to 7.1
job 7.1

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 98.9
clothing 92.2
person 91.1
furniture 88.9
woman 83.8
table 62.8
chair 60.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 33-41
Gender Female, 100%
Sad 50.4%
Disgusted 21.3%
Happy 14.4%
Angry 3.6%
Surprised 3.1%
Calm 2.9%
Fear 2.8%
Confused 1.6%

AWS Rekognition

Age 14-22
Gender Female, 99.7%
Happy 56.4%
Calm 27.8%
Angry 4.3%
Disgusted 4.2%
Sad 3.5%
Surprised 2%
Confused 1.3%
Fear 0.6%

AWS Rekognition

Age 24-34
Gender Female, 97.5%
Calm 65.4%
Fear 26.7%
Sad 2.3%
Angry 1.9%
Surprised 1.6%
Disgusted 1.3%
Confused 0.5%
Happy 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair
Person
Chair 99.3%
Chair 95.8%
Chair 81.5%
Person 98.3%
Person 96.3%
Person 93%
Person 72.4%

Categories

Text analysis

Amazon

48
DEC
64
131

Google

DEC 48 131
DEC
48
131