Human Generated Data

Title

Untitled (three boys posing on spiral staircase)

Date

1965

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16767

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Person 99.5
Person 98.5
Acrobatic 96.4
Shoe 94.3
Clothing 94.3
Footwear 94.3
Apparel 94.3
Leisure Activities 69.2
Gymnastics 58.6
Sport 58.6
Sports 58.6
Gymnast 57.9
Athlete 57.9
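
The Amazon entries above pair a label with a confidence score in percent, the format returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of such a call, assuming boto3 is installed, AWS credentials are configured, and "photo.jpg" stands in for the image file:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,      # cap the number of labels returned
            MinConfidence=50,  # drop low-confidence guesses
        )

    # Each label carries a name and a confidence in percent,
    # e.g. "Person 99.7" in the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))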

Clarifai
created on 2023-10-29

people 99.8
group together 97.5
man 96.3
street 95.8
woman 95.6
monochrome 95
adult 94.8
two 93.9
wedding 93.5
child 92
dancer 91.5
step 91.3
group 91.1
dancing 89.9
recreation 89.9
wear 87.7
couple 86.5
actress 85.7
girl 84.6
three 84.1

Imagga
created on 2022-02-26

people 20.1
man 18.8
room 18.7
person 17.7
window 17.1
indoors 16.7
patient 16.4
chair 15.3
male 15
dress 13.5
adult 13.2
wall 12.8
home 12.8
building 12.8
women 12.6
interior 12.4
portrait 12.3
house 11.7
hospital 11.4
black 10.8
happy 10.6
medical 10.6
modern 10.5
couple 10.4
old 10.4
equipment 10.4
men 10.3
device 10
light 10
care 9.9
bride 9.6
furniture 9.5
lifestyle 9.4
nurse 9.4
face 9.2
worker 9.2
health 9
life 8.9
smiling 8.7
wedding 8.3
musical instrument 8.2
girls 8.2
office 8.2
clinic 8
architecture 7.9
love 7.9
smile 7.8
scene 7.8
glass 7.8
luxury 7.7
seat 7.7
relaxation 7.5
brass 7.5
traditional 7.5
art 7.3
decoration 7.2
backboard 7.1
work 7.1
family 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.7
person 92.9
outdoor 85.5
clothing 81.4
man 53.9

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 57%
Happy 88.7%
Calm 6.8%
Fear 1.3%
Surprised 1.3%
Sad 1.1%
Confused 0.3%
Angry 0.3%
Disgusted 0.3%

AWS Rekognition

Age 23-31
Gender Male, 99.7%
Calm 56.9%
Happy 38.8%
Surprised 1.5%
Fear 0.9%
Sad 0.8%
Confused 0.5%
Angry 0.3%
Disgusted 0.3%

AWS Rekognition

Age 18-26
Gender Female, 98.4%
Happy 59.7%
Calm 16.2%
Sad 11.7%
Surprised 5%
Disgusted 3%
Fear 2.1%
Angry 1.4%
Confused 0.8%
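
The age range, gender, and ranked emotion percentages in the three blocks above correspond to per-face attributes of the kind returned by Amazon Rekognition's DetectFaces operation. A minimal sketch, assuming boto3 with configured AWS credentials and "photo.jpg" as a placeholder image:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort so the dominant one prints first.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")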

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
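
Unlike the Rekognition percentages, the Google Vision blocks above report each attribute as a likelihood bucket ("Very unlikely", "Unlikely", and so on). A minimal sketch of a face-detection call with the google-cloud-vision client (version 2 or later assumed), again with "photo.jpg" as a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for i, face in enumerate(response.face_annotations, start=1):
        print(f"Face {i}")
        # Each attribute is a Likelihood enum value rather than a percentage.
        print("  Surprise", face.surprise_likelihood.name)
        print("  Anger", face.anger_likelihood.name)
        print("  Sorrow", face.sorrow_likelihood.name)
        print("  Joy", face.joy_likelihood.name)
        print("  Headwear", face.headwear_likelihood.name)
        print("  Blurred", face.blurred_likelihood.name)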

Feature analysis

Amazon

Person 99.7%
Person 99.5%
Person 98.5%
Shoe 94.3%

Text analysis

Amazon

2
KADOK

Google

MJI7--YT3RA°2 YAGON
MJI7--YT3RA°2
YAGON
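
The detected strings above are raw machine-read text from the print and its edge markings, reproduced without correction. A minimal sketch of line-level text detection with Amazon Rekognition's DetectText, assuming boto3 with configured AWS credentials and "photo.jpg" as a placeholder:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # LINE entries are whole detected lines; WORD entries are single tokens.
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))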