Human Generated Data

Title

Untitled (man and children with large model train)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17821

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Person 99.4
Person 99.4
Person 98.6
Person 98.1
Person 97.2
Person 96
Person 95.9
Person 93.7
Building 80.3
Lab 74
Meal 71.6
Food 71.6
Factory 68.5
Cafeteria 65.2
Restaurant 65.2
Train 63.2
Transportation 63.2
Vehicle 63.2
LCD Screen 62.2
Electronics 62.2
Screen 62.2
Monitor 62.2
Display 62.2
Clinic 60.7
Worker 56
Workshop 55.3
Person 45.3

Clarifai
created on 2023-10-28

people 99.7
man 98.2
woman 97.7
adult 97.4
group 97.3
indoors 96.1
group together 94.7
monochrome 93.8
administration 92.7
room 92.2
many 91.5
chair 91.3
furniture 91.3
leader 90.8
sit 87.4
several 86.2
five 83.3
education 80.9
war 80.5
child 78

Imagga
created on 2022-02-26

man 31.6
male 26.2
people 22.8
interior 21.2
smiling 20.2
business 20
adult 19.8
person 19.7
brass 19.3
musical instrument 18.2
hospital 17.6
happy 17.5
women 17.4
office 17.2
trombone 17.2
room 16.7
indoors 16.7
table 16.6
modern 16.1
wind instrument 15.5
men 15.4
home 15.1
teacher 14.6
nurse 14.2
work 14.1
businessman 14.1
meeting 13.2
cheerful 13
worker 12.2
couple 12.2
happiness 11.7
glass 11.7
classroom 11.6
together 11.4
group 11.3
education 11.2
sitting 11.2
lifestyle 10.8
chair 10.7
smile 10.7
life 10.5
standing 10.4
two 10.2
communication 10.1
holding 9.9
team 9.8
professional 9.8
to 9.7
luxury 9.4
day 9.4
house 9.2
hall 9
transportation 9
working 8.8
patient 8.8
class 8.7
clinic 8.7
corporate 8.6
businesspeople 8.5
casual 8.5
enjoyment 8.4
teamwork 8.3
percussion instrument 8.3
student 8.3
school 8.3
human 8.2
indoor 8.2
counter 8
furniture 7.9
love 7.9
colleagues 7.8
portrait 7.8
old 7.7
togetherness 7.5
senior 7.5
marimba 7.5
blackboard 7.4
businesswoman 7.3
building 7.2
board 7.2
holiday 7.2
travel 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

indoor 93.2
table 88.3
person 78.9
text 70
preparing 43.8
cooking 23.5

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 73.6%
Calm 98.8%
Sad 0.4%
Disgusted 0.2%
Happy 0.1%
Fear 0.1%
Surprised 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 29-39
Gender Female, 55.6%
Calm 47.1%
Sad 21.9%
Happy 11.1%
Confused 9.5%
Angry 2.9%
Surprised 2.8%
Fear 2.5%
Disgusted 2.2%

AWS Rekognition

Age 41-49
Gender Male, 75.7%
Calm 98.7%
Surprised 0.4%
Sad 0.4%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 98.1%
Calm 63.5%
Sad 36%
Angry 0.2%
Confused 0.1%
Happy 0.1%
Fear 0.1%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 31-41
Gender Male, 99.1%
Calm 76.8%
Sad 18.5%
Disgusted 1.1%
Confused 1%
Surprised 0.9%
Happy 0.7%
Angry 0.6%
Fear 0.4%

AWS Rekognition

Age 19-27
Gender Female, 90.6%
Calm 58.3%
Sad 19%
Happy 17.6%
Fear 1.3%
Surprised 1.3%
Angry 0.9%
Disgusted 0.8%
Confused 0.8%

AWS Rekognition

Age 14-22
Gender Female, 85.1%
Calm 60%
Happy 23.5%
Angry 8.9%
Sad 2.5%
Disgusted 1.5%
Confused 1.2%
Surprised 1.2%
Fear 1.1%

AWS Rekognition

Age 11-19
Gender Female, 51.9%
Calm 69.6%
Sad 27.1%
Confused 0.9%
Disgusted 0.8%
Angry 0.5%
Happy 0.4%
Fear 0.4%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Person 99.4%
Person 99.4%
Person 98.6%
Person 98.1%
Person 97.2%
Person 96%
Person 95.9%
Person 93.7%
Person 45.3%
Train 63.2%

Text analysis

Amazon

Pueblos
Land of Pueblos
3.
Land
of
KODVR
THE

Google

Land of Pucblos Grand CarU 3.
Land
of
Pucblos
Grand
CarU
3.