Human Generated Data

Title

Untitled (group portrait of sixteen children in front of Christmas tree in house)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9141

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 97.9
Person 97.8
Person 97.5
Person 97.4
Person 97.4
Person 96.2
Person 96.2
Person 96.1
Person 95.5
Person 94.1
Person 92
Leisure Activities 90.6
Musician 90.4
Musical Instrument 90.4
Person 89.8
Person 84.5
People 74.1
Guitar 73.4
Room 69.7
Indoors 69.7
Text 69.2
Person 66.7
Girl 60.7
Female 60.7
Music Band 59.5

Clarifai
created on 2023-10-27

people 100
group 99.7
adult 98.7
many 98.5
child 97.8
music 97.7
woman 97.7
man 97
musician 95.8
wear 95.7
group together 95
stringed instrument 93.9
instrument 93.5
education 93.3
recreation 91
sit 90.6
furniture 89.3
administration 88.9
sitting 86.1
room 85.9

Imagga
created on 2022-01-23

people 28.4
person 23.9
musical instrument 23.1
brass 22.2
man 22.2
male 20.6
classroom 19.1
room 18.7
group 17.7
wind instrument 17.6
adult 17.5
blackboard 16.5
business 16.4
men 16.3
black 16.2
teacher 15.3
women 15
chair 14
office 13.9
education 13.8
silhouette 13.2
computer 13
modern 12.6
music 12.3
indoor 11.9
class 11.6
lifestyle 11.6
percussion instrument 11.5
businessman 11.5
indoors 11.4
board 10.8
student 10.8
school 10.6
working 10.6
style 10.4
team 9.8
window 9.7
students 9.7
teaching 9.7
interior 9.7
equipment 9.7
desk 9.6
life 9.5
motion 9.4
youth 9.4
hand 9.1
trombone 9.1
night 8.9
home 8.8
table 8.8
shop 8.7
happiness 8.6
musician 8.6
sitting 8.6
art 8.5
barbershop 8.4
human 8.2
television 8.2
technology 8.2
happy 8.1
looking 8
smiling 7.9
laptop 7.9
party 7.7
studying 7.7
singer 7.7
casual 7.6
dance 7.6
college 7.6
city 7.5
monitor 7.4
marimba 7.4
star 7.3
portrait 7.1
cornet 7.1
kid 7.1
work 7.1
together 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

Color analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 38-46
Gender Male, 93.3%
Calm 96.9%
Surprised 1.2%
Disgusted 0.5%
Sad 0.4%
Confused 0.3%
Happy 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 35-43
Gender Male, 100%
Surprised 78.7%
Calm 6.4%
Sad 6.3%
Happy 2.9%
Angry 2.3%
Fear 1.5%
Disgusted 1.2%
Confused 0.7%

AWS Rekognition

Age 30-40
Gender Male, 96.7%
Calm 86.6%
Sad 4.1%
Surprised 3.3%
Fear 2.1%
Confused 1.4%
Happy 1.2%
Disgusted 0.9%
Angry 0.5%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Calm 70.9%
Surprised 18.8%
Happy 5%
Angry 1.6%
Fear 1.2%
Disgusted 0.9%
Sad 0.8%
Confused 0.8%

AWS Rekognition

Age 37-45
Gender Male, 99.8%
Calm 77.4%
Sad 20.1%
Happy 0.6%
Fear 0.5%
Angry 0.4%
Disgusted 0.4%
Confused 0.3%
Surprised 0.3%

AWS Rekognition

Age 36-44
Gender Male, 99.5%
Calm 81.9%
Happy 12.9%
Sad 1.3%
Surprised 1.1%
Confused 0.8%
Disgusted 0.7%
Angry 0.7%
Fear 0.5%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Calm 86.3%
Happy 5.9%
Confused 2.4%
Sad 2.3%
Angry 1.1%
Disgusted 1%
Surprised 0.7%
Fear 0.3%

AWS Rekognition

Age 42-50
Gender Male, 97.7%
Calm 91.6%
Sad 4.8%
Disgusted 1.2%
Confused 0.9%
Surprised 0.9%
Angry 0.3%
Happy 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

MJI7
MJI7 ОСЛИА
ОСЛИА
$ 911

Google

MJI7 YT3RA2
MJI7
YT3RA2