Human Generated Data

Title

Untitled (five children gathered around nativity scene model in front of fireplace)

Date

1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9749

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Human 99.4
Person 99
Person 95.9
Person 94.6
Person 92.2
Game 88.9
Chess 71.4
Person 68.7
People 59.2
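
The nine entries above are label and confidence-score pairs (on a 0-100 scale) of the kind AWS Rekognition's DetectLabels returns. As a minimal sketch only, assuming configured AWS credentials and a placeholder local file name (neither is part of the museum record), comparable tags could be produced like this:

```python
# Minimal sketch: label/confidence pairs like the Amazon tags above,
# via AWS Rekognition DetectLabels. File name and thresholds are
# illustrative assumptions, not part of the museum record.
import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photograph.jpg", "rb") as f:    # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,       # assumed cap on returned labels
    MinConfidence=50,   # assumed confidence floor
)

# Each label has a Name and a Confidence in 0-100, matching entries
# such as "Human 99.4" and "Person 99" listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```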

Clarifai
created on 2023-10-27

people 99.9
group 99.5
child 99.1
man 97.8
adult 97.7
group together 96.6
three 96.5
vehicle 95.3
two 94.7
nostalgia 94.6
monochrome 94.2
boy 93.5
leader 91.9
woman 90.4
several 90.3
four 90.1
education 90.1
chair 89.9
administration 86.9
music 86.1
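
The Clarifai entries are concept names with scores shown as percentages. As a hedged sketch only, assuming Clarifai's v2 REST "outputs" endpoint, an assumed general-recognition model ID, a placeholder access token, and an illustrative image URL, similar concepts could be requested like this:

```python
# Hedged sketch: concept tags like the Clarifai list above via Clarifai's
# v2 REST API. Endpoint path, model ID, token, and image URL are all
# assumptions for illustration.
import requests

CLARIFAI_PAT = "YOUR_PAT"               # placeholder personal access token
MODEL_ID = "general-image-recognition"  # assumed general model ID
URL = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]}

resp = requests.post(URL, json=payload,
                     headers={"Authorization": f"Key {CLARIFAI_PAT}"})
resp.raise_for_status()

# Concepts carry a name and a value in [0, 1]; scaled by 100 they compare
# to entries such as "people 99.9" and "group 99.5" above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```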

Imagga
created on 2022-01-24

brass 58.8
wind instrument 50.5
musical instrument 35.6
cornet 29.1
person 25.1
man 23.5
people 22.3
sax 21.2
business 20.6
businessman 20.3
male 19.8
work 15.1
adult 14.7
job 13.3
senior 12.2
blackboard 11.6
bass 11.5
hand 11.4
design 11.2
looking 11.2
technology 10.4
home 10.4
money 10.2
finance 10.1
symbol 10.1
room 9.9
team 9.8
old 9.7
group 9.7
education 9.5
men 9.4
smiling 9.4
newspaper 9.2
holding 9.1
sign 9
teacher 9
human 9
professional 8.9
player 8.9
classroom 8.6
paper 8.6
chart 8.6
drawing 8.6
architecture 8.6
bright 8.6
writing 8.6
portrait 8.4
silhouette 8.3
bank 8.2
one 8.2
laptop 8.2
office 8.2
happy 8.1
financial 8
computer 8
worker 8
working 7.9
audience 7.8
modern 7.7
skill 7.7
crowd 7.7
patriotic 7.7
nation 7.6
product 7.4
building 7.4
school 7.4
occupation 7.3
time 7.3
student 7.2
success 7.2
board 7.2
currency 7.2
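
The Imagga entries are tag names with confidence scores on a 0-100 scale. A minimal sketch, assuming Imagga's /v2/tags endpoint with placeholder API credentials and an illustrative image URL, could fetch comparable tags:

```python
# Minimal sketch: tags like the Imagga list above from the /v2/tags endpoint.
# Credentials and the image URL are placeholders.
import requests

API_KEY = "YOUR_API_KEY"        # placeholder
API_SECRET = "YOUR_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs a confidence (0-100) with a language-keyed tag name,
# e.g. {"confidence": 58.8, "tag": {"en": "brass"}}.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```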

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

text 98.8
clothing 84
person 79.8
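
The three Microsoft entries are tag names with confidence scores. As a hedged sketch, assuming Azure's Computer Vision "analyze" endpoint (v3.2), a placeholder resource endpoint and subscription key, and an illustrative image URL, comparable tags could be requested like this:

```python
# Hedged sketch: tag/confidence pairs like the Microsoft list above via the
# Azure Computer Vision analyze endpoint. Endpoint, API version, key, and
# image URL are assumptions for illustration.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                     # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    json={"url": "https://example.org/photo.jpg"},                # placeholder URL
    headers={"Ocp-Apim-Subscription-Key": KEY},
)
resp.raise_for_status()

# Tags come back with a confidence in [0, 1]; scaled by 100 they compare to
# "text 98.8", "clothing 84", and "person 79.8" above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```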

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 97.8%
Calm 96.4%
Sad 2.7%
Happy 0.5%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 93.6%
Calm 68.8%
Surprised 24.5%
Angry 2.8%
Fear 1.7%
Disgusted 0.8%
Happy 0.6%
Sad 0.5%
Confused 0.3%
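
The two AWS Rekognition blocks above are per-face estimates of age range, gender, and emotion confidences. A minimal sketch, assuming a placeholder file name and configured AWS credentials, shows how DetectFaces returns these attributes:

```python
# Minimal sketch: per-face age, gender, and emotion estimates like the two
# AWS Rekognition blocks above, via DetectFaces with all attributes.
# The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences, highest first, mirror the Calm/Sad/Happy/... lists.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```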

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
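
The two Google Vision blocks above are per-face likelihood ratings (surprise, anger, sorrow, joy, headwear, blur) on a scale from "Very unlikely" to "Very likely". A minimal sketch, assuming the google-cloud-vision client, configured credentials, and a placeholder file name, shows how face detection reports them:

```python
# Minimal sketch: per-face likelihood ratings like the two Google Vision
# blocks above, using the google-cloud-vision client. The file name is a
# placeholder and credentials are assumed to be configured.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # placeholder file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Each face annotation exposes likelihood enums from VERY_UNLIKELY to VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```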

Feature analysis

Amazon

Person
Person 99%
Person 95.9%
Person 94.6%
Person 92.2%
Person 68.7%

Captions

Text analysis

Amazon

:
1
a
MJ17--YT37A°S--
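
The four Amazon text-analysis entries above are the raw line-level OCR detections recorded for this print, including stray marks. A minimal sketch, assuming a placeholder file name and configured AWS credentials, shows how DetectText returns such lines:

```python
# Minimal sketch: line-level OCR detections like the Amazon text-analysis
# entries above, via AWS Rekognition DetectText. The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Detections come back at LINE and WORD granularity with confidences;
# printing only LINE entries mirrors the list above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```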