Human Generated Data

Title

Untitled (group of boys at bar mitzvah)

Date

1956

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19007

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.1
Human 99.1
Furniture 99.1
Person 98.9
Clothing 98.5
Apparel 98.5
Person 97.7
Person 97.3
Chair 96.9
Person 94
Shorts 91.2
Person 90.1
Person 89.7
Chair 85.5
Tie 83.7
Accessories 83.7
Accessory 83.7
Person 80.2
Face 79.3
Tie 78.9
Table 74.7
Shirt 74.2
Person 73.7
Dining Table 70.6
People 70.5
Sleeve 70.1
Suit 67.4
Coat 67.4
Overcoat 67.4
Photography 67.1
Photo 67.1
Portrait 66.8
Female 64.3
Person 61.7
Long Sleeve 60.5
Sitting 59.8
Plant 55.9
Pants 55.6
Flower 55.4
Blossom 55.4
Person 54.6

Clarifai
created on 2023-10-22

people 99.8
group together 99
group 98.4
man 97.4
adult 96.4
woman 96.2
leader 95.1
three 93.1
several 90.8
chair 90.5
administration 90.1
wear 87.2
five 86.1
two 86
many 85.6
child 84.8
medical practitioner 83.9
four 83.2
boxer 82.8
facial expression 82.5

Imagga
created on 2022-03-05

man 34.2
male 27.7
person 26.9
people 22.9
lifestyle 18.1
adult 18
ball 18
professional 17.9
health 13.9
nurse 13.9
basketball 13.6
sport 13.3
black 13.2
men 12.9
musical instrument 12.8
patient 12
hospital 11.8
brass 11.1
work 11.1
business 10.9
active 10.7
wind instrument 10.7
happy 10.6
businessman 10.6
device 10.5
portrait 10.3
casual 10.2
leisure 10
body 9.6
play 9.5
equipment 9.4
basketball equipment 9.4
competition 9.1
fitness 9
team 9
game 8.9
group 8.9
job 8.8
looking 8.8
home 8.8
women 8.7
boy 8.7
education 8.7
student 8.6
world 8.6
doctor 8.5
hand 8.3
life 8.3
holding 8.2
occupation 8.2
exercise 8.2
school 8.1
handsome 8
game equipment 8
room 8
working 7.9
medical 7.9
player 7.8
teacher 7.7
two 7.6
uniform 7.6
planner 7.6
sports equipment 7.6
healthy 7.6
action 7.4
playing 7.3
smiling 7.2
racket 7.2
worker 7.1
indoors 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.7
clothing 93.3
person 93.2
man 90.4
black and white 54.6
crowd 0.5

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 63.3%
Calm 95.5%
Happy 3.7%
Sad 0.4%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Sad 93.9%
Calm 2.6%
Angry 1.3%
Confused 0.9%
Happy 0.4%
Surprised 0.4%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Male, 94.5%
Happy 80.9%
Calm 16.5%
Sad 0.8%
Surprised 0.5%
Disgusted 0.4%
Angry 0.4%
Confused 0.3%
Fear 0.2%

AWS Rekognition

Age 33-41
Gender Male, 98.3%
Sad 53%
Surprised 16.1%
Calm 13%
Confused 5.7%
Disgusted 3.7%
Happy 3.3%
Fear 2.9%
Angry 2.2%

AWS Rekognition

Age 26-36
Gender Male, 96.7%
Calm 84%
Surprised 9.6%
Happy 3.6%
Disgusted 0.7%
Confused 0.7%
Sad 0.6%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Calm 95.6%
Surprised 1.3%
Sad 1.2%
Confused 0.5%
Disgusted 0.5%
Angry 0.5%
Happy 0.3%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Sad 78.4%
Confused 6.6%
Calm 6.4%
Angry 2.1%
Happy 2%
Disgusted 2%
Surprised 1.4%
Fear 1.2%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Sad 47.8%
Calm 26.7%
Surprised 10.7%
Angry 4.1%
Disgusted 3.7%
Fear 2.7%
Happy 2.2%
Confused 2.1%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Sad 75.1%
Calm 9%
Surprised 4.8%
Fear 3.8%
Happy 2.5%
Angry 2.4%
Confused 1.4%
Disgusted 1%

AWS Rekognition

Age 45-51
Gender Male, 100%
Happy 26.9%
Calm 21.8%
Sad 19.3%
Surprised 10.9%
Confused 10.7%
Angry 4.9%
Disgusted 4.5%
Fear 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Tie
Person 99.1%
Person 98.9%
Person 97.7%
Person 97.3%
Person 94%
Person 90.1%
Person 89.7%
Person 80.2%
Person 73.7%
Person 61.7%
Person 54.6%
Chair 96.9%
Chair 85.5%
Tie 83.7%
Tie 78.9%

Text analysis

Amazon

1
KODA
FILE
EE
4/3
SKFE