Human Generated Data

Title

Untitled (women playing cards and wearing hats)

Date

1948

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19344

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.5
Person 98.7
Clothing 98.5
Apparel 98.5
Person 98.2
Person 98.2
Person 96.8
Person 96.4
Table 94.1
Furniture 94.1
Face 89.3
Chair 87.3
Female 86.3
Meal 85.1
Food 85.1
Person 84.5
Dining Table 82.5
Sitting 81.5
People 79.9
Helmet 74.7
Person 73.8
Dish 72.3
Suit 71.6
Coat 71.6
Overcoat 71.6
Woman 69.7
Photography 69.1
Photo 69.1
Restaurant 67.9
Crowd 67.2
Portrait 67.1
Girl 64.7
Dress 63.4
Hat 62.1
Tablecloth 61.4
Indoors 56.3

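The Amazon values above are Rekognition label confidences in percent. A minimal sketch, assuming a local scan of the print and default settings, of how such a list can be produced with boto3's DetectLabels call (the file name, region, and threshold are placeholders, not the museum's actual pipeline):

```python
import boto3

# Minimal sketch: image label detection with AWS Rekognition DetectLabels.
# The file name, region, and MinConfidence threshold are assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_women_playing_cards.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=50,
    )

# Print "Label Confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```
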
Clarifai
created on 2023-10-22

people 99.9
child 99
group 98.9
woman 98.4
adult 97.6
man 97.4
group together 96.9
boy 94.6
furniture 94.5
education 93.8
family 93.6
wear 90
sit 87.7
many 87.6
monochrome 87.5
administration 87
teacher 85.6
five 83.6
indoors 83.3
several 82.6

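The Clarifai concepts above appear to be confidences on a 0-100 scale. A rough sketch of a call to Clarifai's v2 predict endpoint over plain REST; the API key, model ID, and image URL are assumptions, and newer Clarifai SDK versions expose the same operation differently:

```python
import requests

# Rough sketch: concept prediction via Clarifai's v2 REST predict endpoint.
# API key, model ID, and image URL are placeholders/assumptions.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed public general model ID
IMAGE_URL = "https://example.org/untitled_women_playing_cards.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concept values come back in [0, 1]; scale to percent to match the list.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```
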
Imagga
created on 2022-03-05

percussion instrument 82.9
marimba 77.5
musical instrument 72
man 36.2
classroom 33.4
people 28.4
male 26.9
room 24.4
group 21.7
person 21.5
business 21.2
education 20.8
businessman 20.3
teacher 19.6
table 19
adult 18.6
school 17.3
men 17.2
team 17
sitting 16.3
student 16.3
happy 16.3
work 15.8
smiling 15.2
office 14.6
brass 14.5
lifestyle 14.4
meeting 14.1
chair 13.4
together 13.1
teamwork 13
colleagues 12.6
job 12.4
interior 12.4
holding 12.4
boy 12.2
worker 12.1
blackboard 11.9
women 11.9
wind instrument 11.2
child 10.9
businesswoman 10.9
board 10.8
teaching 10.7
working 10.6
30s 10.6
talking 10.4
businesspeople 10.4
desk 10.4
smile 10
modern 9.8
professional 9.8
kid 9.7
indoors 9.7
class 9.6
diversity 9.6
corporate 9.4
executive 9.4
learning 9.4
study 9.3
communication 9.2
indoor 9.1
laptop 9.1
steel drum 9
black 9
cheerful 8.9
to 8.8
diverse 8.8
40s 8.7
studying 8.6
happiness 8.6
20s 8.2
children 8.2
family 8
couple 7.8
device 7.8
standing 7.8
portrait 7.8
four 7.7
two 7.6
workplace 7.6
enjoyment 7.5
vibraphone 7.4
mature 7.4
confident 7.3
color 7.2
cornet 7.2
home 7.2
handsome 7.1
restaurant 7

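The Imagga tags (including the percussion-instrument misreads at the top of the list) match the shape of its /v2/tags auto-tagging endpoint. A hedged sketch using HTTP basic auth; the credentials and image URL are placeholders:

```python
import requests

# Sketch: Imagga auto-tagging via GET /v2/tags with HTTP basic auth.
# API key/secret and the image URL are placeholders/assumptions.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/untitled_women_playing_cards.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries a confidence (percent) and a localized tag name.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```
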
Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 97.8
text 96.6
table 92.8
clothing 92.4
outdoor 89.1
man 81.3
drawing 60.8

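The Microsoft tags are consistent with the Azure Computer Vision analyze operation with the Tags visual feature requested. A minimal REST sketch; the resource endpoint, API version, key, and image URL are assumptions:

```python
import requests

# Sketch: Azure Computer Vision image analysis requesting the Tags feature.
# Endpoint, key, API version, and image URL are placeholders/assumptions.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_CV_KEY"
IMAGE_URL = "https://example.org/untitled_women_playing_cards.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Tag confidences are returned in [0, 1]; scale to percent to match the list.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```
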
Face analysis

AWS Rekognition

Age 41-49
Gender Male, 91.1%
Happy 91%
Surprised 5.8%
Calm 1.5%
Disgusted 0.7%
Sad 0.5%
Fear 0.3%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 36-44
Gender Male, 99.7%
Happy 38.8%
Calm 37.3%
Surprised 18.8%
Fear 1.6%
Sad 1.3%
Confused 1%
Disgusted 0.6%
Angry 0.5%

AWS Rekognition

Age 30-40
Gender Female, 71.6%
Surprised 64.3%
Happy 28.8%
Calm 3.4%
Sad 1%
Disgusted 0.9%
Confused 0.6%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 45-53
Gender Male, 89.8%
Happy 97.1%
Surprised 1.4%
Calm 0.6%
Sad 0.3%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Surprised 59.4%
Happy 25.9%
Sad 4.9%
Confused 4.1%
Calm 2.3%
Disgusted 1.5%
Fear 1.3%
Angry 0.6%

AWS Rekognition

Age 33-41
Gender Male, 97.6%
Happy 60.7%
Surprised 18.3%
Calm 16.5%
Fear 1.6%
Sad 0.9%
Disgusted 0.7%
Angry 0.7%
Confused 0.6%

AWS Rekognition

Age 49-57
Gender Male, 93.4%
Sad 98.3%
Calm 0.6%
Happy 0.5%
Fear 0.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%

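Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with confidence, and emotion scores that sum to roughly 100%. A minimal boto3 sketch of the DetectFaces call that yields these attributes; the file name and region are placeholders:

```python
import boto3

# Sketch: per-face attribute estimation with AWS Rekognition DetectFaces.
# File name and region are placeholders/assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_women_playing_cards.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```
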
Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

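The Google Vision rows report per-face likelihood ratings (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch using the google-cloud-vision client library; the local file name is a placeholder:

```python
from google.cloud import vision

# Sketch: face detection with the Google Cloud Vision client library.
# The local file name is a placeholder/assumption.
client = vision.ImageAnnotatorClient()

with open("untitled_women_playing_cards.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Enum names such as VERY_UNLIKELY correspond to the labels shown above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```
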
Feature analysis

Amazon

Person
Helmet
Person 99.6%
Person 99.5%
Person 98.7%
Person 98.2%
Person 98.2%
Person 96.8%
Person 96.4%
Person 84.5%
Person 73.8%
Helmet 74.7%

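The per-instance Person and Helmet percentages come from the instance detections that Rekognition attaches to certain labels (each instance has its own bounding box and confidence). A self-contained sketch, with the same placeholder file name as above:

```python
import boto3

# Sketch: per-instance detections (bounding box + confidence) from
# Rekognition DetectLabels, as in the Person/Helmet feature list.
# File name and region are placeholders/assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_women_playing_cards.jpg", "rb") as image_file:
    response = rekognition.detect_labels(Image={"Bytes": image_file.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(left={box['Left']:.2f}, top={box['Top']:.2f})")
```
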
Categories

Imagga

people portraits 82.8%
events parties 16%

Text analysis

Amazon

23
500
EG 500
EG
KODAKS

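The short strings under Amazon are the kind of output returned by Rekognition's DetectText call. A minimal boto3 sketch; the file name and region are placeholders:

```python
import boto3

# Sketch: text-in-image detection with AWS Rekognition DetectText.
# File name and region are placeholders/assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_women_playing_cards.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# LINE-level detections roughly correspond to the strings listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```
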
Google

23
23