Human Generated Data

Title

Untitled (overhead view of ping pong match with onlookers inside warehouse)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9433

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Person 99.4
Person 99
Person 98.7
Person 97.9
Person 97.2
Person 96.4
Sport 95.7
Sports 95.7
Person 94.8
Person 93.9
Person 91.2
Person 89.4
Ping Pong 87.9
Person 85.9
Person 85.4
Person 68.2
Person 63.2
Person 62.7
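
The tags above follow the shape of AWS Rekognition label detection output: a label name plus a confidence score. A minimal sketch of how such a list could be generated with boto3, assuming configured AWS credentials and a hypothetical local scan of the print named photo.jpg (the museum's actual pipeline is not documented here):

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:         # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=60,                      # the lowest Amazon tag above is 62.7
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))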

Clarifai
created on 2023-10-26

people 99.5
group 98.1
man 95
adult 94.1
woman 91.9
indoors 91.6
furniture 90.5
monochrome 89.6
group together 88.6
room 88
many 83.3
child 83.3
education 77.3
family 76.8
leader 76.5
music 73.5
five 72.3
administration 69.6
classroom 68.3
interaction 67.5

Imagga
created on 2022-01-23

room 45.9
table 36
interior 35.3
classroom 24.7
people 24.5
home 23.9
modern 23.1
furniture 22.9
house 20.9
chair 20.8
indoors 20.2
indoor 19.2
person 19
decor 18.6
male 18.4
business 18.2
man 17.5
musical instrument 16.9
businessman 16.8
glass 16.5
percussion instrument 16.5
gymnasium 16.4
group 16.1
floor 15.8
design 15.7
office 15.7
inside 15.6
desk 14.8
kitchen 14.3
work 14.3
adult 13.8
happy 13.8
men 13.7
apartment 13.4
athletic facility 13.2
decoration 13.1
smiling 13
architecture 12.5
teacher 12.1
domestic 11.7
education 11.2
luxury 11.1
marimba 11.1
center 11.1
women 11.1
executive 11
day 11
wood 10.8
worker 10.7
school 10.7
cheerful 10.6
meeting 10.4
corporate 10.3
facility 10.1
light 10
businesswoman 10
hall 9.6
lamp 9.5
living 9.5
sitting 9.4
lifestyle 9.4
negative 9.3
window 9.2
team 8.9
new 8.9
family 8.9
life 8.9
couple 8.7
residential 8.6
businesspeople 8.5
study 8.4
counter 8.3
stove 8.3
holding 8.2
clothing 8.1
equipment 8.1
student 8
blackboard 8
cabinet 8
conference 7.8
space 7.8
3d 7.7
class 7.7
professional 7.7
elegant 7.7
sofa 7.6
comfortable 7.6
estate 7.6
board 7.5
film 7.5
manager 7.4
vibraphone 7.4
mature 7.4
metal 7.2
computer 7.2
suit 7.2
portrait 7.1
job 7.1
happiness 7
together 7

Google
created on 2022-01-23

Table 88.5
Snapshot 74.3
Event 69.5
Room 67.8
Uniform 65.9
Monochrome 65.7
Art 65.6
Team 64.1
Machine 63.8
Net 60.5
History 59.9
Illustration 58.3
Monochrome photography 57.8
Crew 56.3
Rectangle 54.5

Microsoft
created on 2022-01-23

person 99.7
indoor 92.9
tennis 92.8
text 85.8
ping pong 83.8
group 82.1
table tennis racket 82
racket 79.1
racquet sport 74.8
people 72.5
indoor games and sports 60.6
table 52.5
ball 51.3
old 46.3

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 63.6%
Calm 99.5%
Happy 0.1%
Disgusted 0.1%
Sad 0.1%
Surprised 0.1%
Fear 0.1%
Confused 0%
Angry 0%

AWS Rekognition

Age 37-45
Gender Male, 67.4%
Calm 80.9%
Sad 13.5%
Confused 3.3%
Angry 0.9%
Happy 0.7%
Disgusted 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Female, 88.9%
Calm 89.4%
Sad 9%
Confused 0.6%
Happy 0.3%
Angry 0.3%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 31-41
Gender Female, 52.6%
Sad 89.6%
Happy 4.3%
Confused 2.1%
Calm 1.7%
Disgusted 0.7%
Angry 0.7%
Fear 0.5%
Surprised 0.4%

AWS Rekognition

Age 28-38
Gender Male, 88.7%
Sad 93%
Confused 3.4%
Happy 1.3%
Calm 1.1%
Angry 0.5%
Disgusted 0.4%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 85.1%
Sad 80.5%
Calm 10.1%
Confused 3.6%
Angry 2.1%
Happy 1.6%
Disgusted 1%
Fear 0.6%
Surprised 0.5%

AWS Rekognition

Age 24-34
Gender Male, 89.2%
Sad 65.9%
Calm 18.9%
Confused 5.4%
Disgusted 3.4%
Angry 3.2%
Fear 1.5%
Happy 1.1%
Surprised 0.6%
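
Each block above (an age range, a gender estimate, and emotions ranked by confidence) matches the FaceDetails records that AWS Rekognition face detection returns when all attributes are requested. A minimal sketch with boto3, under the same photo.jpg and credentials assumptions as before:

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:         # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # the default attribute set omits age, gender, and emotions
)

for face in response["FaceDetails"]:
    print(f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")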

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
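
The Google Vision blocks report per-face attributes as likelihood ratings rather than percentages. A minimal sketch using the google-cloud-vision client library (v2+), again assuming configured credentials and the hypothetical photo.jpg; the LIKELIHOOD list mirrors the API's enum ordering:

from google.cloud import vision

# Likelihood enum order in the Vision API: 0=UNKNOWN .. 5=VERY_LIKELY
LIKELIHOOD = ["Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely"]

client = vision.ImageAnnotatorClient()  # assumes credentials are configured

with open("photo.jpg", "rb") as f:      # hypothetical local scan of the print
    image = vision.Image(content=f.read())

for face in client.face_detection(image=image).face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])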

Feature analysis

Amazon

Person
Person 99.7%
Person 99.4%
Person 99%
Person 98.7%
Person 97.9%
Person 97.2%
Person 96.4%
Person 94.8%
Person 93.9%
Person 91.2%
Person 89.4%
Person 85.9%
Person 85.4%
Person 68.2%
Person 63.2%
Person 62.7%

Text analysis

Amazon

3
N
8 3
de
8
KODVK-SELA

Google

YT37A2-XAGON
YT37A2-XAGON
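
The strings above are raw OCR detections returned by the services, reproduced as-is. A minimal sketch of the Amazon side using Rekognition text detection with boto3, under the same photo.jpg and credentials assumptions:

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:         # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":        # WORD entries repeat the same text
        print(detection["DetectedText"])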