Human Generated Data

Title

Untitled (several men in uniform playing pool inside pool hall)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914-2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13406

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Person 99.3
Person 98.5
Person 98.4
Person 98.1
Person 97.8
Boat 94.9
Transportation 94.9
Vehicle 94.9
Person 90.9
Clinic 88.8
Furniture 85.5
Person 84.5
Table 83.1
Building 69.1
Operating Theatre 64.5
Hospital 64.5
Factory 62.5
Lab 56.9
Waterfront 55.7
Water 55.7

Clarifai
created on 2023-10-29

people 99.7
group together 99.4
many 99.3
group 98.5
recreation 98
man 96.5
furniture 94.5
adult 93.9
crowd 92.7
several 87.1
competition 85.3
three 84.4
club 83.7
spectator 83.1
indoors 82.5
woman 82.3
audience 80.7
vehicle 80.4
game 79.3
seat 77

Imagga
created on 2022-03-05

stage 68.9
platform 52
man 22.1
people 19.5
male 16.3
group 15.3
adult 13.3
person 13
equipment 12.9
black 12
water 12
active 12
lifestyle 11.5
business 11.5
light 11.3
teacher 11
modern 10.5
club 10.4
silhouette 9.9
music 9.9
fun 9.7
sexy 9.6
building 9.6
body 9.6
sport 9.6
dark 9.2
professional 9.2
portrait 9
transportation 9
color 8.9
indoors 8.8
concert 8.7
boy 8.7
happiness 8.6
musical 8.6
gym 8.6
boat 8.3
training 8.3
musical instrument 8.3
student 8.1
happy 8.1
fitness 8.1
room 8
performer 7.8
smile 7.8
education 7.8
model 7.8
percussion instrument 7.7
travel 7.7
performance 7.6
sky 7.6
studio 7.6
table 7.6
leisure 7.5
gymnastic apparatus 7.4
musician 7.4
grand piano 7.4
stringed instrument 7.4
lights 7.4
entertainment 7.4
smiling 7.2
board 7.2
looking 7.2
ship 7.2
team 7.2
work 7.1

Microsoft
created on 2022-03-05

black 77.1
text 72.6
black and white 56.3
old 43.8

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 60.3%
Calm 99.7%
Sad 0.1%
Happy 0.1%
Disgusted 0%
Surprised 0%
Confused 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 28-38
Gender Female, 88.2%
Calm 93.2%
Happy 2.9%
Sad 1%
Fear 0.8%
Angry 0.6%
Disgusted 0.5%
Confused 0.5%
Surprised 0.5%

AWS Rekognition

Age 48-54
Gender Male, 90.8%
Calm 77.9%
Sad 13.4%
Happy 4.6%
Confused 1.9%
Disgusted 0.8%
Fear 0.6%
Surprised 0.4%
Angry 0.4%

AWS Rekognition

Age 34-42
Gender Male, 99.1%
Calm 96%
Sad 1.4%
Disgusted 1%
Surprised 0.4%
Happy 0.4%
Angry 0.3%
Fear 0.2%
Confused 0.2%

AWS Rekognition

Age 23-31
Gender Male, 88.9%
Happy 95.1%
Calm 2.9%
Sad 0.5%
Fear 0.4%
Surprised 0.3%
Disgusted 0.3%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 27-37
Gender Female, 97.6%
Sad 65.7%
Calm 25.7%
Surprised 2.4%
Confused 1.8%
Angry 1.2%
Fear 1.1%
Happy 1.1%
Disgusted 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Boat
Person 99.4%
Person 99.3%
Person 98.5%
Person 98.4%
Person 98.1%
Person 97.8%
Person 90.9%
Person 84.5%
Boat 94.9%

Text analysis

Amazon

YT37A2
MJIR
MJIR YT37A2 ОСЛИА
OJ
ОСЛИА