Human Generated Data

Title

Untitled (group of children on stairs)

Date

1947

People

Artist: John Howell, American, active 1930s–1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21630

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Person 99.7
Person 99.5
Person 98.9
Person 98.9
Person 98.8
Person 98.4
Clothing 97.5
Apparel 97.5
Person 95
Person 91.1
Person 90.7
Shoe 90.2
Footwear 90.2
Person 88.6
Shorts 87.7
Person 87.1
Person 85.1
Shoe 79.9
Dress 78.5
Person 77.5
People 76.4
Person 75.1
Crowd 74
Female 70.9
Building 70.8
Outdoors 69.5
Face 66.9
Person 66.5
Leisure Activities 62.9
Girl 61.4
Photography 61.3
Photo 61.3
Coat 61.3
Suit 56.7
Overcoat 56.7
Pants 56.4
Pedestrian 56.3
Musician 56.2
Musical Instrument 56.2
Nature 55.7
Music Band 55.5
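
The label-and-confidence pairs above are the output of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags can be reproduced with boto3, assuming configured AWS credentials (the filename photo.jpg is a placeholder):

```python
import boto3

client = boto3.client("rekognition")

# Read the image locally; Rekognition also accepts S3 object references.
with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```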

Clarifai
created on 2023-10-22

people 100
group together 99.7
many 99.1
man 98.5
group 98.2
recreation 97.4
adult 97.2
child 97
athlete 95.4
woman 94.4
competition 93.4
boy 93.2
spectator 92.9
crowd 91.2
street 90.9
fun 84.5
sport 83.1
monochrome 81.5
motion 78.8
adolescent 76.7
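
Clarifai's concept tags are returned as 0-1 values by its predict endpoint and read here as percentages. A hedged sketch against the v2 REST API; the access token, model ID, and image URL are placeholders, and the exact model ID should be checked against Clarifai's current documentation:

```python
import requests

PAT = "YOUR_CLARIFAI_PAT"               # placeholder personal access token
MODEL_ID = "general-image-recognition"  # assumed ID of the general concepts model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Each concept has a name and a 0-1 value; scale to match the listing above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```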

Imagga
created on 2022-03-05

brass 30
wind instrument 26.2
people 23.4
man 22.8
musical instrument 22
sport 18.9
adult 17.6
male 17
bugle 16.6
silhouette 16.5
men 16.3
group 16.1
person 15.6
business 14.6
outdoors 14.2
athlete 14.1
city 14.1
trombone 12.9
travel 12.7
building 12.1
women 11.9
crowd 11.5
outdoor 11.5
boy 11.3
wall 11.1
architecture 10.9
recreation 10.7
businessman 10.6
couple 10.4
walking 10.4
weapon 10.2
competition 10.1
device 10
active 9.7
world 9.4
motion 9.4
work 9.4
exercise 9.1
portrait 9.1
dress 9
human 9
fun 9
success 8.8
office 8.8
day 8.6
corporate 8.6
walk 8.6
black 8.5
old 8.4
summer 8.4
spectator 8.3
ball 8.1
runner 8.1
body 8
lifestyle 7.9
urban 7.9
run 7.7
sky 7.6
happy 7.5
light 7.5
vacation 7.4
fitness 7.2
suit 7.2
to 7.1
modern 7
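
Imagga tags of this shape come from its /v2/tags endpoint, which uses HTTP Basic auth with an API key/secret pair. A rough sketch with placeholder credentials and image URL:

```python
import requests

auth = ("YOUR_API_KEY", "YOUR_API_SECRET")  # placeholder Imagga credentials

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=auth,
)

# Each entry pairs a confidence with a language-keyed tag name.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```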

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96.3
outdoor 95.5
person 92.5
clothing 88.1
footwear 70.1
group 57
net 18
clothes 15.2
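
Microsoft's tags correspond to Azure Computer Vision's Analyze Image call with the Tags visual feature; confidences come back as 0-1 values. A sketch with placeholder endpoint and key (that the v3.2 REST version was used is an assumption):

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},  # placeholder image URL
)

# Scale the 0-1 confidences to percentages to match the listing above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```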

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 85.3%
Calm 99.9%
Sad 0%
Happy 0%
Disgusted 0%
Confused 0%
Surprised 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 48-56
Gender Female, 92.2%
Calm 93.8%
Surprised 3.4%
Happy 1.3%
Disgusted 0.6%
Sad 0.3%
Confused 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Female, 99.8%
Calm 96.8%
Fear 0.9%
Surprised 0.9%
Confused 0.4%
Disgusted 0.3%
Happy 0.3%
Sad 0.3%
Angry 0.2%

AWS Rekognition

Age 26-36
Gender Male, 88.4%
Calm 99.8%
Surprised 0.1%
Happy 0%
Confused 0%
Fear 0%
Disgusted 0%
Sad 0%
Angry 0%

AWS Rekognition

Age 18-24
Gender Female, 100%
Calm 98.4%
Surprised 0.7%
Fear 0.6%
Sad 0.2%
Happy 0.1%
Disgusted 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 35-43
Gender Female, 99.8%
Calm 96%
Happy 1.7%
Surprised 1.2%
Sad 0.3%
Confused 0.3%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Female, 97.9%
Calm 83%
Surprised 14.5%
Happy 0.9%
Fear 0.7%
Sad 0.3%
Disgusted 0.3%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 35-43
Gender Male, 50.2%
Calm 99.9%
Surprised 0.1%
Sad 0%
Confused 0%
Angry 0%
Fear 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 36-44
Gender Female, 89.3%
Calm 99.7%
Happy 0.2%
Sad 0.1%
Fear 0%
Disgusted 0%
Confused 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 26-36
Gender Male, 55.6%
Happy 56.1%
Surprised 15.8%
Calm 13.1%
Confused 5.4%
Angry 4.4%
Disgusted 2.1%
Fear 1.7%
Sad 1.4%

AWS Rekognition

Age 24-34
Gender Male, 52.9%
Calm 99.8%
Happy 0.1%
Confused 0%
Surprised 0%
Sad 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 30-40
Gender Female, 96.8%
Calm 85.6%
Happy 6%
Disgusted 2.8%
Surprised 2.4%
Sad 1%
Fear 0.9%
Confused 0.7%
Angry 0.5%

AWS Rekognition

Age 28-38
Gender Female, 95.5%
Calm 98.4%
Sad 0.4%
Happy 0.3%
Surprised 0.2%
Confused 0.2%
Fear 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 33-41
Gender Male, 99.1%
Calm 95.8%
Sad 2.1%
Confused 0.5%
Surprised 0.5%
Disgusted 0.4%
Happy 0.3%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 29-39
Gender Female, 97.5%
Calm 90.2%
Surprised 7.9%
Sad 1%
Disgusted 0.2%
Happy 0.2%
Angry 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Female, 60.9%
Calm 98.3%
Sad 0.5%
Happy 0.5%
Confused 0.2%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
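
Each AWS Rekognition block above is one entry in the FaceDetails list returned by DetectFaces with all attributes requested: an estimated age range, a gender guess with its confidence, and an eight-way emotion distribution. A minimal boto3 sketch (placeholder filename):

```python
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are reported as confidences over eight categories, highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```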

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
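
The Google Vision blocks map to the likelihood fields of each FaceAnnotation, which are five-step enums (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A sketch with the google-cloud-vision client; it prints enum names such as VERY_UNLIKELY instead of the prose labels above (placeholder filename):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each likelihood is an enum value, one per attribute, per detected face.
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        likelihood = getattr(face, f"{field}_likelihood")
        print(field.title(), vision.Likelihood(likelihood).name)
```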

Feature analysis

Amazon

Person 99.8%
Person 99.7%
Person 99.5%
Person 98.9%
Person 98.9%
Person 98.8%
Person 98.4%
Person 95%
Person 91.1%
Person 90.7%
Person 88.6%
Person 87.1%
Person 85.1%
Person 77.5%
Person 75.1%
Person 66.5%
Shoe 90.2%
Shoe 79.9%

Text analysis

Amazon

ROHRO
20
SAL
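
Strings such as these are LINE detections from Rekognition's DetectText operation. A minimal sketch (placeholder filename):

```python
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries are whole detected strings; WORD entries are their sub-units.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```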