Human Generated Data

Title

Untitled (children watching performance by monkey)

Date

c. 1960

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21622

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 98.5
Chair 95.5
Person 95.2
Human 95.2
Person 95.1
Person 92.6
Person 90.6
Person 88.3
Room 85.2
Indoors 85.2
Person 83.4
Sitting 65
Flooring 63.5
Couch 61.7
Living Room 61.3
People 60.4
Clinic 59.9
Floor 57.3
Person 48.9

Clarifai
created on 2023-10-22

people 99.8
chair 98
furniture 97.9
group 97.9
group together 97.6
room 95.8
child 95.8
adult 95.3
man 95.2
seat 95.1
many 94.8
woman 94.4
leader 93.8
education 93.7
indoors 92.5
sit 92.5
school 92.2
administration 90.1
several 86.5
war 83.5

Imagga
created on 2022-03-05

room 40.8
musical instrument 30.4
table 28.6
wind instrument 28.5
chair 27.8
classroom 27
people 26.8
teacher 25.9
interior 25.6
businessman 24.7
office 24.7
business 23.7
man 23.5
person 23.3
male 20.6
brass 20.5
professional 19.2
men 18.9
adult 18.9
meeting 18.8
group 18.5
indoors 17.6
home 17.5
modern 17.5
bowed stringed instrument 16.7
executive 16.4
corporate 16.3
oboe 16
women 15.8
educator 15.2
communication 15.1
stringed instrument 15.1
cornet 14.4
work 14.1
smiling 13.7
sitting 13.7
hall 13.6
businesswoman 12.7
suit 12.6
team 12.5
house 12.5
computer 12.1
restaurant 11.6
furniture 11.5
job 11.5
working 11.5
couple 11.3
floor 11.2
inside 11
indoor 11
glass 10.9
design 10.7
happy 10.6
violin 10.6
together 10.5
education 10.4
desk 10.4
manager 10.2
teamwork 10.2
lifestyle 10.1
board 10
conference 9.8
class 9.6
woodwind 9.6
talking 9.5
day 9.4
worker 9
success 8.8
chairs 8.8
light 8.7
empty 8.6
dining 8.6
sax 8.4
laptop 8.3
window 8.2
style 8.2
student 8
smile 7.8
standing 7.8
students 7.8
drinking 7.7
finance 7.6
businesspeople 7.6
holding 7.4
coffee 7.4
occupation 7.3
girls 7.3
building 7.3
food 7.2

Google
created on 2022-03-05

Furniture 94.8
Chair 91.7
Black 89.5
Black-and-white 87.6
Interior design 84.6
Style 84.1
Art 83.3
Table 83.1
Adaptation 79.2
Monochrome photography 78.1
Monochrome 77.8
Font 71.5
Event 71.1
Room 71.1
Design 68.2
Building 68.1
Sitting 67.2
Suit 67.1
Visual arts 67
Painting 66.5

Microsoft
created on 2022-03-05

floor 90.7
furniture 89.2
indoor 87
table 85.3
person 81.1
chair 72.5
text 68.1
black and white 56.9

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.2%
Sad 51.4%
Happy 19.7%
Confused 12.2%
Calm 8.4%
Disgusted 2.9%
Surprised 2.2%
Angry 2%
Fear 1.1%

AWS Rekognition

Age 18-24
Gender Female, 87.7%
Calm 86.8%
Angry 7.1%
Sad 2.4%
Happy 1.6%
Surprised 0.6%
Fear 0.5%
Disgusted 0.5%
Confused 0.4%

AWS Rekognition

Age 23-31
Gender Male, 80.4%
Sad 94.5%
Calm 3.6%
Confused 1.5%
Angry 0.1%
Fear 0.1%
Surprised 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Male, 97.1%
Calm 91.8%
Sad 4%
Disgusted 1.6%
Confused 1.1%
Angry 0.6%
Happy 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 40-48
Gender Female, 88.8%
Happy 39.6%
Calm 37.2%
Sad 17.6%
Angry 1.4%
Surprised 1.3%
Confused 1.2%
Disgusted 1.1%
Fear 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair
Person
Chair 95.5%
Person 95.2%
Person 95.1%
Person 92.6%
Person 90.6%
Person 88.3%
Person 83.4%
Person 48.9%

Text analysis

Amazon

د2
KODAK-SEETA

Google

ra
ra